[HN Gopher] MIT study finds Tesla drivers become inattentive whe...
___________________________________________________________________
MIT study finds Tesla drivers become inattentive when Autopilot is
activated
Author : camjohnson26
Score : 77 points
Date : 2021-09-21 19:41 UTC (3 hours ago)
(HTM) web link (techcrunch.com)
(TXT) w3m dump (techcrunch.com)
| [deleted]
| slownews45 wrote:
| Did MIT study whether the "inattentive" drivers had higher
| accident rates and fatality rates than other drivers?
|
| That should be the question.
|
| "MIT study shows folks drive 5MPH faster with seatbelts on" -
| doesn't mean we should get rid of seatbelts if overall damage and
| accidents / injury rates are reduced.
|
| I'm serious - do the study. Put someone through a 6-hour drive,
| or a simulated SF-to-LA drive or something, one group in a car
| with Tesla's driver aids and one without. Then have something
| happen (sudden braking, someone cutting in front of you, etc.)
| during half of the trips.
|
| Who crashes? The "inattentive" Tesla driver, or the other driver?
|
| Based on my own observation, there are drivers in regular cars
| texting, looking up directions, or playing with their phone,
| maybe for music. I've seen people putting on makeup, or sitting
| at a green light for 5 minutes on their phone. I've seen people
| drift out of their lanes crazily. These are folks in normal
| cars. I've seen folks speeding and swerving in and out of
| traffic, some without using blinkers.
| kirse wrote:
| I ride a motorcycle and can pretty easily spot patterns of
| distracted driving, but after buying a truck with some ride
| height, it is amazing how many people are regularly on/touching
| their phone for something while driving. On the order of 50-75%
| in a city environment. After this recent revelation I'm almost
| more amazed at how few accidents there are given the sheer
| number of cell phone addicts.
| zionic wrote:
| This. Every study suggesting that holding a cellphone is as
| risky as drunk driving must be false.
|
| I rode a commuter bus for years and saw the exact same thing
| you describe, and no one would seriously suggest 50-75% of
| drivers are (equivalently) drunk.
| bumby wrote:
| > _Every study suggesting holding a cellphone is equally
| risky to drunk driving must be false._
|
| Or drunk driving isn't as dangerous as previously thought.
|
| Or there are many more drunk drivers than previously thought.
|
| I don't necessarily agree with either of those, but the
| point is there are many possible explanations and we should
| be careful about jumping to conclusions.
|
| A better way of refuting those claims is to find flaws in
| their methodology.
| lamontcg wrote:
| You can't just compare attentive Tesla drivers (where you're
| probably projecting your own self onto this driver) with the
| worst non-Tesla drivers.
|
| You have to compare the worst Tesla drivers to the worst
| drivers in regular vehicles. Average Tesla drivers to average
| drivers of regular vehicles, etc.
|
| And I'm not sure a simulator is the best comparison.
| [deleted]
| Kydlaw wrote:
| The study doesn't say whether it's bad or good. It just says
| that attention decreases when Autopilot is on.
|
| The question you ask is a completely different one, and thus
| isn't addressed in the article.
| mwint wrote:
| I've only driven a Tesla with Autopilot once, for ten minutes
| or so. By the end of it, I was struck by how much more time I
| was spending looking far ahead, or looking in the mirrors for
| longer. It felt like I switched to farther-field situational
| awareness once I was confident the car wouldn't rear end the
| dude in front of me.
|
| So if attentiveness is measured as "staring at the license
| plate in front of me", my attentiveness went down. But I felt
| safer since I was looking hundreds of yards down the road for
| situations I might have to handle or avoid, at a longer
| timescale.
| sjburt wrote:
| This is how I feel as well (having driven many miles with a
| Comma), but I have no idea if I represent an average driver
| or if my perceived attentiveness matches what is really
| happening.
|
| I do think that it is very easy for me to spend longer
| looking at a screen or button when the autopilot is driving
| for me, since there is no immediate penalty (lane line,
| rumble strips, etc) for looking away.
| ummonk wrote:
| You don't need autopilot for that though - I do the same in
| my car with basic traffic adaptive cruise control.
| structural wrote:
| Do people really stare at the license plate in front of them
| anyways, though? It was made quite clear during driver
| training courses many years ago not to do this and that your
| glance pattern should always start and return to near the
| horizon (whether that be hundreds of yards in a city, or
| miles away on a highway).
|
| I've always had the opposite problem: if I'm not maintaining
| situational awareness to the horizon, I'll inevitably fall
| asleep after an hour or two out of sheer boredom. I prefer to
| drive long distances in manual-transmission vehicles (there's
| more to do!), and commercial vehicles with loads requiring
| careful management of engine RPM are even better. Those I can
| drive for 10-12 hours in a day without issue. I never saw
| myself as anything other than an ordinary driver.
|
| In my case, a Tesla autopilot is the worst possible
| compromise. It removes enough attention requirements that it
| feels much harder to stay attentive, but without automating
| the process completely. Other people I've talked to who
| operate large pieces of mechanical equipment have often said
| similar things about automation: it's best to be either
| a) fully manual or b) fully automated for a given stretch of
| time.
| mcguire wrote:
| " _Individual glance metrics calculated at the epoch-level and
| binned by 10-s units of time across the available epoch lengths
| revealed that drivers in near-crashes have significantly longer
| on-road glances, and look less frequently between on- and off-
| road locations in the moments preceding a precipitating event
| as compared to crashes. During on-road glances, drivers in
| near-crashes were found to more frequently sample peripheral
| regions of the roadway than drivers in crashes. Output from the
| AttenD algorithm affirmed the cumulative net benefit of longer
| on-road glances and of improved attention management between
| on- and off-road locations._ "
|
| Then, there's
| http://www.trb.org/Publications/Blurbs/171327.aspx, but I
| haven't given them my address. (https://pdf.sciencedirectassets
| .com/271664/1-s2.0-S000145751...)
| notshift wrote:
| Tesla publishes data on accident rates with and without
| autopilot, and the rate with autopilot engaged is lower. The
| problem is, that data is mostly meaningless because people
| don't use autopilot in conditions when an accident is most
| likely to occur.
|
| I suppose you'd have to compare against only manual driving in
| good conditions to make a fair comparison. Not sure that data
| exists anywhere.
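|
| A toy illustration of that selection effect (made-up numbers,
| not real crash data; it simply assumes AP miles are driven only
| on easy roads and that AP is slightly worse there):
|
|     # Hypothetical crashes per million miles, by road difficulty.
|     easy_rate_human, hard_rate_human = 1.0, 8.0
|     easy_rate_ap = 1.2   # assume AP slightly *worse* on easy roads
|
|     ap_miles_easy = 9.0      # millions of miles, AP engaged (easy roads only)
|     human_miles_easy = 5.0   # manual driving happens in both conditions
|     human_miles_hard = 5.0
|
|     ap_rate = (easy_rate_ap * ap_miles_easy) / ap_miles_easy
|     human_rate = ((easy_rate_human * human_miles_easy
|                    + hard_rate_human * human_miles_hard)
|                   / (human_miles_easy + human_miles_hard))
|
|     print(f"AP:     {ap_rate:.2f} crashes/M miles")    # 1.20
|     print(f"Manual: {human_rate:.2f} crashes/M miles")  # 4.50
|
| The raw comparison favors AP even though AP was assumed worse in
| like-for-like conditions, which is exactly the confound above.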
| zarkov99 wrote:
| Of course they do; that is the point of AP. The question that
| matters is whether the drivers are safer, to themselves and
| others, than they would be without AP. From what I have read,
| this is undoubtedly true.
| sjg007 wrote:
| I think your brain just turns off since it doesn't have to manage
| the task anymore. Keeping your brain engaged when you have no
| control is stressful - it's like when your spouse tells you not
| to be a backseat driver. So I imagine that if Autopilot is good
| enough, I would definitely relax. The issue with that is you have
| to be quick enough to intervene if the car goes off the rails.
| saltmeister wrote:
| what a surprise
| crackercrews wrote:
| I saw a friend 'drive' with autopilot, and it was scary. He was
| basically dozing at 80 MPH. Made me rethink my aspiration to have
| such a vehicle.
| elisharobinson wrote:
| I went through the article... read the paper it was based on...
| saw the data they cited... saw the conclusion they reached...
| and was left seriously worried about the state of academia and
| what qualifies as "research".
|
| >> The only significant conclusion this paper had was that a
| driver was more than 22% likely to look at the "down and center
| stack" when on AP.
|
| >> The paper was based on data from this paper,
| https://ieeexplore.ieee.org/document/8751968, authored by none
| other than my favourite podcaster, Dr. Lex Fridman. The study
| was conducted on 25 cars, all Model S/X from 2016-2019, i.e.
| pre-FSD hardware and pre driver monitoring.
|
| >> Now, Tesla has a big screen on the center console, and I
| would find it natural to look at the navigation when I'm not
| driving. BUT all other aspects of driving were not that far from
| normal... turns out people do all sorts of dumb stuff with or
| without AP.
|
| >> This took me less than 15 minutes of googling, and I was
| baffled by how something so stupid could get past even the most
| bare-bones peer review. And then I found out about "The Advanced
| Vehicle Technology Consortium (AVT)" (https://agelab.mit.edu/avt),
| the sponsor of some of the co-authors, which curiously has major
| car companies and suppliers listed as partners - except for
| Tesla. I don't have inside info on how money changes hands in
| academia... but the whole thing looks pretty bad.
|
| Personally I am biased towards Tesla (aka a Tesla fanboy), but I
| feel there is enough wrong here to justify my case that this was
| an academia-sponsored hit piece, not too different from climate
| change deniers or tobacco companies.
| mehrdada wrote:
| It is not clear to me that the metrics of attention people
| traditionally use are optimal when evaluating Autopilot-like
| systems. Anecdotally, I have found myself not staring forward
| anywhere near as much, but I am in fact more attentive to what
| other drivers are doing, taking control where necessary--in
| effect delegating the lowly work to the machine. I do feel this
| is much safer than being constantly forced to look at the road
| in front of me or touch something.
| nojito wrote:
| Given that there really is scant proof that backup cameras work
| in keeping people safe, the hypothesis that autopilot makes
| people safe is shaky at best.
| romwell wrote:
| We can test this hypothesis by looking at fatalities per mile
| driven, aaaaaaaand humans are still better drivers at this
| point.
| belltaco wrote:
| Source?
| droopyEyelids wrote:
| Here's a biology point: accurate self-assessment is extremely
| difficult for the human brain, because the ability to assess
| yourself is one of the first things to degrade under external
| stimuli or sleep deprivation.
|
| All you can really say is that you _feel_ more attentive.
| elif wrote:
| Some situations are fully quantifiable, IMO. For instance, on
| those 80 mph highway sweeping curves, my car is nailing the
| centerline better than I possibly ever could. As a result I'm
| glancing at the cars around me, both visually and using the
| 3D model, rather than focusing on where my tires go.
| romwell wrote:
| Yeah, and nailing the curves does not equal more attention.
|
| Knowing where the tires are going is also important, like
| in that case where a Tesla accelerated into a highway
| divider.
| Enginerrrd wrote:
| Yeah there's a reason that forward attention is the
| default. Spreading attention to lower probability and
| lower gravity issues may not be optimal, even if it feels
| better. I'd guess it's riskier than maintaining attention
| in the direction of momentum. But it depends on the
| statistics of autopilot failure.
| elif wrote:
| Words are a terrible tool for communicating expanded
| awareness. I can only try to do my best.
|
| Attempting to characterize the system by describing its
| exceptions is not really accurate.
| reissbaker wrote:
| If you want data on whether Autopilot is harmful or
| helpful, you could take a look at crash statistics for
| highway Autopilot-on miles vs highway miles of average
| vehicles. The Autopilot-on miles are much better than the
| American national average.
|
| Visual behavior pattern changes are a microbenchmark that
| may not even be measuring attention at all. Regardless,
| the pattern changes don't seem to increase crash risk as
| compared to not using Autopilot in any study that I'm
| aware of.
|
| If there is data supporting a claim that Autopilot
| actually increases crashes, I would be curious to see it.
| "A Tesla crashed on Autopilot once" is just an anecdote;
| humans crash cars constantly, and the question is whether
| Autopilot results in greater or fewer crashes.
| Retric wrote:
| And a huge number of people have done the same thing and
| completely ignored their surroundings while using AP.
|
| The question isn't whether people reading a book are capable
| of using Autopilot safely; the question is whether people can
| supervise level 3 self-driving systems, or whether those
| systems need to be evaluated assuming generally inattentive
| drivers.
|
| Counterintuitively, the less attention people have been paying
| while AP is engaged, the closer it already is to being a level
| 4 system. That is, if people are catching 90% of potentially
| fatal mistakes, then as a level 4 system it would be 10x as
| deadly. Alternatively, if they're catching only 10% of
| potentially fatal mistakes, it is already almost as safe as a
| level 4 system.
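|
| A quick sketch of that arithmetic (illustrative numbers only,
| not data from any study):
|
|     # If supervising drivers catch a fraction `catch` of AP's
|     # potentially fatal mistakes, the observed rate is (1 - catch)
|     # times the system's underlying rate. Dropping the supervisor
|     # (level 4) exposes the full underlying rate, i.e. it is
|     # 1 / (1 - catch) times what we observe today.
|     underlying = 1.0  # underlying fatal mistakes per unit distance
|
|     for catch in (0.90, 0.10):
|         observed = underlying * (1 - catch)
|         print(f"drivers catch {catch:.0%}: unsupervised would be "
|               f"{underlying / observed:.1f}x the observed rate")
|
|     # drivers catch 90%: unsupervised would be 10.0x the observed rate
|     # drivers catch 10%: unsupervised would be 1.1x the observed rate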
| Buttons840 wrote:
| The GP's core argument is that traditional markers of
| attentiveness may not be appropriate when it comes to
| Autopilot.
|
| You then argue that self-assessment is inaccurate. You're
| right, but it just doesn't seem relevant.
| lamontcg wrote:
| The GP is supporting that core argument with anecdotal
| evidence from self-assessment. Dunno how you can say that's
| not relevant to point out, particularly since you agree.
| mehrdada wrote:
| The self-evaluation aspect is kind of beside the point. The
| core question is given a fixed amount of "attention", do you
| want to spend most of it staring forward to keep yourself in
| the lane and avoid rear-ending the next car? Current Tesla
| Autopilot does that particular task almost perfectly today,
| on highway at least.
| romwell wrote:
| Given a fixed amount of attention, I do want to spend most
| of it on making sure my car is not about to crash into an
| obstacle, like another car, or highway divider.
|
| Because Tesla requires people to do that, as there are no
| guarantees that the autopilot won't [1].
|
| [1] https://www.paloaltoonline.com/news/2020/02/25/ntsb-
| teslas-a...
| oliwary wrote:
| This seems sensible, but I do wonder whether actively
| steering the car is as attention intensive as merely
| verifying that the car is steering correctly.
|
| This is pure speculation, but it feels like humans have a
| very robust model to determine when something is about to
| go wrong (i.e. this car is about to crash into an
| obstacle). If it takes less attention to determine that
| the car steers correctly as opposed to steering the car,
| this would leave more attention available to, for
| example, monitor other cars' behavior.
| romwell wrote:
| >This seems sensible, but I do wonder whether actively
| steering the car is as attention intensive as merely
| verifying that the car is steering correctly.
|
| There is a lot of research explaining why it's true,
| particularly when it comes to piloting airplanes.
|
| The TL;DR is that doing something _with your hands_ ,
| involving touch, engages more of your brain.
|
| >This is pure speculation, but it feels like humans have
| a very robust model to determine when something is about
| to go wrong
|
| Yes, this is pure speculation, and also incorrect.
|
| Again, if you were to look things up instead of
| speculating, you'd quickly find plenty of information.
| mwint wrote:
| It's akin to sitting in the passenger seat of a car - you
| can be idly looking out the windshield, not really
| thinking about driving, and your brain will freak out the
| moment the driver doesn't do what you expect. It's like
| my brain always has a predicted path and velocity, and
| deviation from that is so easy for me to detect I do it
| subconsciously.
|
| Maybe other people aren't like that, but I certainly am.
| I find myself wishing there was an FSD-like depiction of
| what's going on in the driver's brain, to answer the "you
| _are_ planning to stop eventually, right?" question.
| [deleted]
| [deleted]
| [deleted]
| spike021 wrote:
| >Anecdotally, I have found myself not staring forward anywhere
| as much, but in fact I am more attentive at what other drivers
| are doing and taking control where necessary
|
| This honestly sounds like exactly why it's a bad thing. It's
| not just the cars ahead of you that you have to worry about.
|
| Have you never driven anywhere where people spontaneously
| cross the road outside of a crosswalk? Especially at night,
| wearing nothing remotely reflective or easy to see?
|
| I've seen this happen on the freeway too. A guy in a
| beige/dark-green jacket and tan pants who should not have been
| there, and was a danger to every driver on the road. Broad
| daylight. If I hadn't been looking ahead I would've missed
| him; thankfully, in my case, he didn't enter my lane.
|
| While it's extremely important to (as my dad labeled it) "drive
| for other drivers" (as in, observe the cars around you and
| anticipate any dumb actions they may make), that does not mean
| you should take your focus off everything else - least of all
| the things you'd see by "staring forward" as much as possible.
| bertil wrote:
| What was the control for that observation? Before and after
| engagement, or arriving on a highway with and without Autopilot?
| antattack wrote:
| The study did not measure attentiveness directly; rather, it
| recorded driver glances for 20s before AP disengagement and
| compared them to 10s of glances after.
|
| When one disengages AP, there's normally a reason for it, and
| that reason is often that one anticipates a more complicated
| maneuver or getting off the highway.
|
| Either way, the environments with AP on and after disengagement
| are different, so they call for different glance patterns. One
| therefore cannot conclude that the driver paid less attention,
| only that their attention was distributed differently.
| sklargh wrote:
| MIT finds water is wet.
| romwell wrote:
| "MIT finds water is wet; HackerNews disagrees"
| xyzzy21 wrote:
| No! Really?
|
| LOL. Of course they are!
| dmix wrote:
| This is why a Comma.ai-style system that points a camera at the
| driver to make sure they are paying attention and awake is
| essential. Even ignoring self-driving, a large number of
| accidents happen when people fall asleep at the wheel.
|
| I'm not sure what's stopping them. It can be done locally to
| avoid privacy concerns (a rough sketch of what that could look
| like is at the end of this comment).
|
| It also has better UX than the weird wheel-touch sensors on
| some vehicles, where you have to be touching the wheel at all
| times while the system is active (I haven't decided if that
| makes sense as a requirement; leaning towards no).
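|
| A rough sketch of the kind of local, in-cabin monitoring
| described above - a toy heuristic using OpenCV's bundled Haar
| cascades, not anything Comma.ai or Tesla actually ships:
|
|     import cv2
|
|     face_cascade = cv2.CascadeClassifier(
|         cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
|     eye_cascade = cv2.CascadeClassifier(
|         cv2.data.haarcascades + "haarcascade_eye.xml")
|
|     cap = cv2.VideoCapture(0)   # hypothetical driver-facing camera
|     frames_without_eyes = 0
|
|     while True:
|         ok, frame = cap.read()
|         if not ok:
|             break
|         gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
|         eyes_visible = False
|         for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
|             face_region = gray[y:y + h, x:x + w]
|             if len(eye_cascade.detectMultiScale(face_region)) > 0:
|                 eyes_visible = True
|         frames_without_eyes = 0 if eyes_visible else frames_without_eyes + 1
|         if frames_without_eyes > 45:   # roughly 1.5 s at 30 fps
|             print("ALERT: driver may not be watching the road")
|             frames_without_eyes = 0
|
| All processing stays on the device; no frames ever need to leave
| the car.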
| elisharobinson wrote:
| The goal is no wheel: just set a destination and put on your
| seatbelt. From that perspective it would make sense to put the
| least amount of effort into driver monitoring.
| cma wrote:
| That's like saying if the eventual goal is realistic
| telepresence through BCI, we should just go ahead and remove
| the seatbelts to quit wasting effort on things that won't be
| in the final vision.
| Rd6n6 wrote:
| We are talking about total surveillance of a driver in exchange
| for better cruise control?
| dmix wrote:
| I'm sure your phone is plugged into your car wherever you go,
| like everyone else's, but no one calls that total surveillance.
|
| Cars already come with microphones for voice commands.
|
| Like I said it could be local and isolated.
| teawrecks wrote:
| It's either that, or they have to be hyper-attentive, constantly
| at the ready to catch a dumb mistake that the autopilot might
| make, which would be more stressful than just driving the car
| yourself.
| WalterBright wrote:
| "Researchers studying glance data from 290 human-initiated
| Autopilot disengagement epochs found drivers may become
| inattentive when using partially automated driving systems."
|
| Next up, researchers discover that people like ice cream.
| throwawayboise wrote:
| Also, water is wet.
| coding123 wrote:
| I don't know about that, the consensus on that seems to be
| leaning towards water NOT being wet.
|
| https://www.debate.org/opinions/is-water-wet
| Kydlaw wrote:
| It could be that the study wasn't conducted with the measures
| that prevent attention drops activated.
| yawboakye wrote:
| When academic studies are dedicated to trivial stuff that is
| either obvious to an unscientific mind (saying this to calm you
| down) or would be easily debunked in another study - and worse,
| come from an institution inextricably linked to science - is it
| any wonder that when they come out and say "masks are safe",
| many unscientific minds (doubling down to still make you feel
| good) don't give a hoot?
|
| But I guess the days when academic study (time, funds, energy,
| raw intellect) was spent on more pressing issues are behind us,
| and we'll have to deal with these kinds of studies going
| forward. I'm so looking forward to the MIT study that finds
| that most HN comments are not read.
| mtreis86 wrote:
| How does it compare to cruise control? I can't turn that on
| without feeling less attentive.
| rvz wrote:
| > The point of this study is not to shame Tesla, but rather to
| advocate for driver attention management systems that can give
| drivers feedback in real time or adapt automation functionality
| to suit a driver's level of attention. Currently, Autopilot uses
| a hands-on-wheel sensing system to monitor driver engagement, but
| it doesn't monitor driver attention via eye or head-tracking.
|
| Precisely. I said this before, and now this study supports my
| previous points even further [0]. Using the wheel to monitor
| driver engagement is not even close to good enough to determine
| whether the driver is paying attention. Comma.ai seems to have
| been able to implement eye tracking for its driver monitoring
| system.
|
| I'm quite surprised that Tesla continues to lack such a system
| to monitor that the driver's eyes are on the road when they
| are behind the wheel, whether Autopilot is engaged or not.
|
| [0] https://news.ycombinator.com/item?id=28208921
| falcolas wrote:
| > I'm quite surprised that Tesla continues to lack such a
| system to monitor that the driver's eyes are on the road when
| they are behind the wheel, whether Autopilot is engaged or not.
|
| Perhaps we shouldn't expect that of people? If you've taken the
| vast burden of driving off their minds, we should accept that
| human nature means they'll stop paying attention to driving.
|
| That, however, explicitly implies that automation needs to be
| much better before it can be used.
| elif wrote:
| > I'm quite surprised that Tesla continues to lack such a
| system to monitor that the driver's eyes are on the road when
| they are behind the wheel, whether Autopilot is engaged or not.
|
| You are surprised for no cause. Tesla uses exactly that system.
|
| https://www.notateslaapp.com/software-updates/version/2021.4...
| elif wrote:
| Since I'm being downvoted for this, here's a link describing
| it:
|
| https://www.notateslaapp.com/news/565/tesla-rolling-out-
| driv...
| Lendal wrote:
| Especially since there's a hardware camera built into every
| Tesla, right at the driver's eye level. I'd rather have that
| system than the wheel sensor. Accidentally applying too much
| force to the steering wheel disables Autopilot, while too
| little force doesn't register. It's annoying trying to get the
| Goldilocks amount of force just right. But the eyes don't lie.
| elif wrote:
| Pro-tip: either of the scroll wheels (volume adjustment or
| speed adjustment) counts as driver input. I typically go
| +1 kph, -1 kph; it's not even noticeable.
| romwell wrote:
| Pro-tip #2: if you insert the belt buckle into the clip
| before getting into the seat, the system thinks you have
| buckled up, it's not even noticeable
| elif wrote:
| I guess I have to explain that applying torque to the
| wheel is less safe than adjusting your volume. My tip is
| a safety tip; your joke doesn't really apply.
| nemothekid wrote:
| > _I'm quite surprised that Tesla continues to lack such a
| system to monitor that the driver's eyes are on the road_
|
| I'm not sure if it's documented anywhere, but I think the 2021
| Tesla models do this [1]. That said, I'm not sure if it's
| enough - this is the same issue Google faced, which led them to
| conclude they had to go L4 or bust.
|
| [1] https://www.theverge.com/2021/5/27/22457430/tesla-in-car-
| cam...
| sschueller wrote:
| We should absolutely shame Tesla. Their marketing sells "Full
| Self Driving" and has many times in the past hinted that
| Autopilot is something it isn't. Elon claiming that your car
| will be a RoboTaxi in ~2 years also doesn't help, and makes
| people think Autopilot must be quite good.
|
| How is a regular Joe supposed to know?
| moduspol wrote:
| The multiple warnings and the dialogue that explicitly
| requires confirmation might be a good hint.
|
| But I think we all know this problem hasn't materialized in
| the crash data, because if it had, that's what we'd be talking
| about. Instead it's just more FUD about the names of features,
| some of which are explicitly not yet delivered.
| romwell wrote:
| The problem didn't materialize in crash data, eh?
|
| It absolutely did.
|
| Why we're not talking about it is another issue.
|
| https://www.paloaltoonline.com/news/2020/02/25/ntsb-
| teslas-a...
| mwint wrote:
| There's a difference between instances and data. GP is
| saying that if someone had a damning graph of large-scale
| statistically-significant data showing AP is crashing
| more, we'd be talking about it.
| romwell wrote:
| We do have this data.
|
| Here are your damning statistics: self-driving cars have
| a _higher_ rate of accidents and fatalities per mile
| driven than human-driven cars [1].
|
| [1]https://www.natlawreview.com/article/dangers-
| driverless-cars
| moduspol wrote:
| Are Tesla's cars with Autopilot involved in fatal crashes
| at a notably higher rate than comparable cars? If the
| answer were "yes," you wouldn't be pointing me to
| anecdotes. You'd be pointing me to that data.
|
| The lack of that data is why, despite years, hundreds of
| thousands of vehicles, and millions upon millions of
| miles driven, people are still blaming the names of
| features.
|
| It's time to move on. Over 100 people die in the US per
| day in traffic accidents. How many years need to pass
| before a handful of anecdotes and speculation doesn't
| pass as valid criticism?
| stefan_ wrote:
| Yes, why do we care that Boeing withheld crucial
| information about the MCAS? Is two crashed planes a lot
| compared to other plane models? Why do we determine the
| root cause for every crashed plane, there aren't a lot of
| those anymore?
|
| There is statistics, and then there is addressing obvious
| design flaws. The latter is worth doing a lot more than
| the former.
| romwell wrote:
| >Are Tesla's cars with Autopilot involved in fatal
| crashes at a notably higher rate than comparable cars?
|
| The answer is yes [1].
|
| >It's time to move on
|
| That time will be when self-driving cars can safely self-drive
| -- which is not now.
|
| [1]https://www.natlawreview.com/article/dangers-
| driverless-cars
| aledalgrande wrote:
| we did not need an MIT study to know this haha
___________________________________________________________________
(page generated 2021-09-21 23:01 UTC)