
The Mountain View incident: He had warnings about putting his hands on the steering wheel.

"Tesla writes in the update that the log from the accident shows that nothing was done by the driver before the collision, and that he had earlier warnings about putting his hands on the steering wheel."

zoltan | 31/03/2018

This accident could have been prevented, and the Autopilot greatly improved, by the following modification to the hands-off behavior. Keeping in mind that a driver not holding the steering wheel may be caused by her/his INABILITY to do so, I propose that when hands are not detected after visual and audible warnings have been given, the Autopilot should:
- KEEP the auto steering function, keeping the car on its path
- Start flashing all lights and sounding the horn, in order to warn other drivers and eyewitnesses
- SLOWLY decelerate and stop the car in its lane
When this life-saving procedure starts, the car should also contact Tesla, reporting the emergency mode and giving the car's location to direct first responders to the scene.
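
To make this concrete, here is a minimal sketch of the proposed emergency mode in Python. The car object and every method on it (keep_lane, flash_all_lights, and so on) are hypothetical illustrations, not Tesla's actual software, and the thresholds are assumptions:

import time

HANDS_OFF_GRACE_S = 10   # assumed grace period after the visual/audible warnings
DECEL_STEP_MPH = 5       # assumed gentle deceleration step per second

def emergency_mode(car):
    """Driver presumed UNABLE to steer: keep lane, warn others, stop, call for help."""
    car.keep_lane(True)                                # do NOT disengage auto steering
    car.flash_all_lights(True)                         # warn other drivers and eyewitnesses
    car.sound_horn(True)
    car.report_emergency(location=car.gps_position())  # direct first responders to the scene
    while car.speed_mph() > 0:                         # decelerate SLOWLY, staying in the lane
        car.set_speed(max(0, car.speed_mph() - DECEL_STEP_MPH))
        time.sleep(1)

def monitor_hands(car):
    hands_off_since = None
    while car.autopilot_engaged():
        if car.hands_on_wheel():
            hands_off_since = None
        elif hands_off_since is None:
            hands_off_since = time.time()
            car.warn_driver()                          # visual and audible warning first
        elif time.time() - hands_off_since > HANDS_OFF_GRACE_S:
            emergency_mode(car)                        # assume inability, act protectively
            break
        time.sleep(0.1)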

mrporter6 | 31/03/2018

I was an airline pilot for 40 years and I know a few things about APs. First and foremost, they are never to be trusted and constantly have to be monitored. This is why there are 2 pilots on the flight deck, i.e. to watch the beast. This article says it all: stupid is as stupid does. https://www.wsj.com/articles/tesla-says-autopilot-was-engaged-in-fatal-c...

zoltan | 31/03/2018

I agree that supervising and, if necessary, overriding AP increases safety. But I think you'll agree that if this supervising function (detected via hands on the wheel, Tesla's current method) is impaired in some way, the safer action is NOT to deny/turn off AP, but to enter an emergency mode that potentially minimizes harm to the driver and other traffic participants or objects.
I don't know what happens if an airplane AP somehow detects the inactivity of both pilots, but I think continuing to fly on AP AND employing other measures (alerting other crew members, radioing the emergency, etc.) would be safer than turning AP off.
As far as "stupid is as stupid does" is concerned, I want to emphasize that the driver may not be able to steer, due to a heart attack, falling asleep, or any other disabling event. In these cases turning the AP off may make the car a "death machine", as unfortunately happened in this case.
I feel that the current software kind of punishes the driver, because it assumes that she/he intentionally breaks the rule of keeping hands on the wheel, instead of assuming inability and acting in a protective way.

SUN 2 DRV | 31/03/2018

Yes, he had some warnings in the car that morning. But even more importantly, his family said he had 7-10 prior AP anomalies at that interchange and he alerted Tesla to them. So besides the normal AP monitoring that he should have been doing at all times, he also should have been hyper-vigilant at that interchange.

Sorry to say, but this is 100% driver responsibility. Not Caltrans, not Tesla, but 100% Walter.

Benz | 31/03/2018

Why didn't the car just keep going and stay in the lane it was driving in?

Why did the car steer to the right towards the concrete barrier?

The driver did not touch the steering wheel.

Autopilot was active.

Autopilot should have kept the car in that lane.

sbeggs | 31/03/2018

Check the phone company's time-stamped logs for texts sent?

SO | 31/03/2018

There will probably be lawsuits coming from the family as well as from the drivers who crashed into the X.

RadOne | 31/03/2018

I don't understand why the car did not at least try to slow down. Do Tesla cars not have a crash mitigation system?

SCCRENDO | 31/03/2018

It does not matter whether AP was active or not. It’s not guaranteed to be flawless. The driver is still responsible for the control of the car

mos6507 | 31/03/2018

It's a TRUST factor. If the driver places his trust in the autopilot, and the autopilot makes a false move, then it's essentially violating that trust, defeating the purpose of autopilot as a way to reduce cognitive load. You can't do both.

That's the problem. Autopilot is basically a driving course for the neural net in which drivers are responsible for correcting anything autopilot might do wrong during the course of the trip. People don't want to look at it that way. They want to be able to take their hands off the wheel and text. It can't be both at the same time.

SCCRENDO | 31/03/2018

People want many things. I want to discover a pot of gold under the rainbow. But alas, like the present state of autopilot, it is not what they believe it to be. So I guess if I believe in the pot of gold I will be disappointed.

SUN 2 DRV | 31/03/2018

"It does not matter whether AP was active or not. It’s not guaranteed to be flawless. The driver is still responsible for the control of the car"

SCC +100

The only thing that is guaranteed is that the driver is responsible for control of the car at ALL times.

jeromeporter89 | 31/03/2018

The problem I have with autopilot is this: if road conditions (painted lines) prevent the system from staying on course, that is one thing. But if the lanes are clearly painted and the AP system still slams into a wall, then the AP system is extremely flawed. To be honest, I use AP to justify the $5,000 I spent, but I am also terrified to use it. Since the new 10.4 upgrade, I have had more problems with the vehicle having a hard time approaching split lanes.

SCCRENDO | 31/03/2018

I found 10.5 on the Model 3 to be a significant upgrade. Lane changing is so much more gentle; in fact, the first time I tried it I thought it wasn't changing lanes. I use AP for around 38 miles of my 42-mile each-way commute and I love it. It's amazing. But randomly it does some dumb and potentially dangerous things. That's why I am always alert behind the wheel, not in the back seat drinking a couple of beers.

sourabh.damodaran | 31/03/2018

I concur with the views of Zoltan. If AP is supposed to reduce the chance of a crash by 40 pct, then once the car has flashed a warning and detects no physical touch on the steering wheel, shouldn't it decelerate and stop? There is always a chance that an individual is unable to use the steering wheel on account of a medical condition.

apsley | 31/03/2018

Well, we all want Tesla to succeed, but we must remember that someone died in this accident. So my condolences to the family, particularly the children, who will now be without a father for the rest of their lives.

achilles992000 | 31/03/2018

Even with autopilot, the driver is always responsible if there is an accident.

You get a warning as soon as you turn on autopilot to keep your hands on the wheel.

If it were perfect, there would be no need for this warning.
The system is clearly not fully autonomous. The driver has to be ready to take over if it makes an error.

scabello800 | 01/04/2018

This is a failure of autopilot. The guy trusted AP. Yes, he shouldn't have, but then Tesla's garbage software shouldn't have pointed his car into a barrier. I can't believe you fanboys. Shame on you. Push Tesla to fix this broken software.

Also, about the "detecting hands on wheel" crap... I sometimes will have my hands on the wheel and AP still warns me to put my hands on the wheel. So I don't know if he had his hands on the wheel or not... and 6 seconds for that garbage sensor is meaningless.

Of course, you would think that if he were paying attention he would have braked or swerved... so I am sure he was not paying full attention. But still, AP sucks. You should not use this software if it can do these things.

I own two Teslas. I like the cars and what Tesla is doing. But they shouldn't lie to people... and when they do, you guys shouldn't preach the BS.

AP is not "3.7x safer than non-AP". AP is generally used on long trips and where there is less chance of death. What moron uses AP on a road where there is a risk of a head-on crash and headlights are required during the daytime? Nobody. That is where people die. So the stats are obviously skewed. When I use AP I am focused like a laser beam, because I know the car is going to swerve.

Of course, I don't expect Tesla to say their software is crap in a press release. But I am ashamed that you guys stand behind this garbage software.

brandobsb | 01/04/2018

Scabello, why not suggest a few things that could be done to improve it? You mention that you need to stay extra focused when using AP because you know the car is going to swerve. How does it swerve? What is instigating it?

You might think the software is crap, but the software is dynamic, changing and upgrading constantly based on input from thousands of cars. Just saying that it sucks doesn't do anything but make you sound like you work for Ford. No offense if you do work for Ford; I imagine it's a nice company.

I personally think the software does have a long way to go and needs to integrate new algorithms that utilize the cameras to identify stationary objects in the car's immediate path. I anticipate that over the next few years AP will take on an increasing number of tasks, and will accumulate data to show it is much safer than humans at those tasks. As you say, it may be in areas traditionally 'safer', like highway driving, but humans are still pretty dumb about highway driving, often under a number of different drugs, stressors, and physiologic factors that decrease their effectiveness as drivers.

-b

Benz | 01/04/2018

Did he rely too much on Autopilot?

How many months had he been driving this blue Tesla Model X?

jordanrichard | 01/04/2018

Scabello, do you have more details about this accident than the rest of us?

bp | 01/04/2018

Even with hands on the steering wheel, that doesn't confirm the driver is actually watching the road. Other manufacturers have driver-facing cameras that can verify the driver is looking forward at the road. Though even with a camera, that isn't a guarantee the driver is paying attention.

AutoPilot should be viewed as a student driver still learning how to drive - until Tesla indicates they believe the software is operating at a level significantly safer than a human driver (which is their goal), drivers must be ready to take control at any time. And if the driver isn't prepared to do that, they should not use AutoSteer.

One area Tesla should address: they should provide more documentation on what is in each software release, rather than relying on customers to figure out what has changed with AutoPilot - what areas have improved - and the areas where AP may still be risky to operate. It's likely that many of the reports from customers after a new release has been distributed are incorrect, and that areas that appear to be running better may not have been changed at all in the release... This can lead to false confidence in the software - which could increase the risk that drivers aren't paying attention when the software doesn't react properly.

Remnant | 01/04/2018

@mos6507 (March 31, 2018)

<< It's a TRUST factor. >>

Then let's program it ... !!!!!

Benz | 01/04/2018

The driver would have turned the wheel to the left during the last few seconds in order to avoid a collision with the concrete barrier, IF HE WAS PAYING ATTENTION AND IF HE WAS ABLE TO DO SO.

He probably had too much faith in Autopilot.

RedShift | 01/04/2018

Fault is with both the AP and the driver. The driver, because he knew there was a potential issue at that particular section of the road. I don't know why he left control to the AP knowing what he knew.

The AP, because it simply failed to perform as promised. Tesla will get called out, and deservedly so. I have always maintained that AP systems are a mad pursuit and that it will likely be several years before they can be trusted. This is one more setback. My advice to Tesla would be to say 'it will be ready when it is ready', stop promising unrealistic FSD drives coast to coast, and test it thoroughly before deploying.

Any system will likely have faults and can never be 100% safe. But this accident is something that should not have occurred, given the lane markers.

matthewtesla | 01/04/2018

Let's use a little reason here when it comes to the driver telling family members about a specific spot on the freeway where the Tesla would try to veer. It does not take a strong imagination to point out that someone could lie about this when they are intending to sue the company. Tesla has also come out saying that he brought the vehicle in for navigation issues and nothing else. Saying it's the AP's fault is like saying it's the computer's fault when you're typing a paper and the electricity cuts out: you lose your work, but you knew it was a possibility. Users are blasted with warnings about autopilot, and Tesla has only increased this to the extent of checking for the user's hands often. The driver is 100% at fault, IMO.

SUN 2 DRV | 01/04/2018

scabello800:

Yes the software is what it is and Tesla should work to make it better, and of course they are.

Nevertheless, the DRIVER is 100% responsible for control of the car, 100% of the time.

KD7UFS | 01/04/2018

zoltan had the right idea: the software should be prepared not only for a driver who is not paying attention, but also for a driver who is unable to pay attention. Holding the vehicle in the lane, flashing the lights and sounding the horn, and bringing the vehicle to a stop sounds a lot better than turning off. Also, all these things are more likely to get the desired reaction from a driver than "ho hum, I need to re-engage the autopilot again". One thing to consider for the future: how easy should it be to override the autopilot? I don't know any pilot who would fly a plane in which he could not override the autopilot, but someday we are going to have cars that are more likely to be right than the driver, especially if the driver has just had a heart attack.

jordanrichard | 01/04/2018

KD7UFS, it takes 0.5 seconds to "override" the AP. You can kick it off by tapping the brake or giving a firm tug on the wheel. A tug on the wheel will kick off the auto steer function only. Tapping the brake will kick off both the auto steer and the TACC.
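
For clarity, a toy sketch of that disengagement behavior as described above; the Car class and event names are made up for illustration, not Tesla's actual code:

class Car:
    def __init__(self):
        self.autosteer = True
        self.tacc = True              # Traffic-Aware Cruise Control

    def on_driver_input(self, event):
        if event == "wheel_tug":      # firm tug on the wheel:
            self.autosteer = False    # kicks off auto steer only; TACC keeps managing speed
        elif event == "brake_tap":    # tap on the brake:
            self.autosteer = False    # kicks off both auto steer
            self.tacc = False         # and TACC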

SacTesla | 01/04/2018

I read he had it set at 1, the closest following-distance setting. What speed was he going? And what speed was the autopilot set at? I use my AP 90% of the time on the highway, but I use 3 for distance and stay aware of traffic and turning.

Remnant | 02/04/2018

@bp (April 1, 2018)

<< ... [Tesla] should provide more documentation on what is in each software release, rather than relying on customers to figure out what has changed with AutoPilot - what areas have improved - and the areas where AP may still be risky to operate. >>

Perfectly reasonable!

But shouldn't we address the "reasonable" as well?

Consider this #zoltan post... "the driver may not be able to steer, due to a heart attack, or falling asleep, or any other disabling event. In these cases turning the AP off may make the car a "death machine", as it unfortunately happened in this case. ... I feel that the current software ... kind of punishes the driver, because it assumes that she/he intentionally breaks the rule of keeping hands on the wheel, instead of assuming inability and acting in a protective way." Very pertinent, I'd say.

The very fact that drivers fail to abide by the recommended action suggests they were ill prepared to do so. Hence, the AP programmer should – nay, MUST – NOT count on it.

DTsea | 02/04/2018

What zoltan suggested is what Tesla says the autopilot already does. There was a case of a man who died at the wheel of a heart attack: AP slowed, stopped, and put on the flashers.

Tesla-David | 02/04/2018

I have been using AP for the past two years in our S85D, and I set the distance at 5 to give an extra margin of safety. I think setting it to 1 was taking a big risk. I am always cautious when using AP, as it is BETA and only a driving assist at this point. Anyone who thinks otherwise is crazy. I feel for the family, but this accident was clearly the fault of the driver, for not paying attention and not accepting the limits of the AP. If he had previously experienced problems with AP in the same location, WHY would he engage AP during this stretch?

whodat | 02/04/2018

Does anyone know where exactly in Mountain View, southbound on 101, the accident occurred? I sometimes use autopilot on my MS90D and drive in Mountain View, and I want to know if there is some external condition that causes autopilot to possibly misbehave (bumpy road, highway painted lines, etc.).

I've been using autopilot for 2 years, and autopilot is not perfect, but under certain conditions it seems to work well: stop-and-go traffic, and straight sections with flat terrain and well-painted highway markers. Bumpy roads that cause the car to go up and down in an oscillatory manner seem to confuse autopilot; it sometimes loses lock and cannot track the painted highway markers.

I have also observed that, with both hands on the steering wheel and autopilot engaged, I have sometimes had the system tell me to put my hands on the steering wheel.

Autopilot should be used with caution and needs to be constantly monitored, since it's not perfect and can do random things.

When I bought my Tesla, I asked what some of the autopilot weaknesses are, and the Tesla sales manager explained the bumpy-road weakness.

I'm a bit surprised there is no training for using autopilot. Maybe someday in the future there will be required training for using autopilot?

RadOne | 02/04/2018

Does anyone know if there is a forward collision crash mitigation system? I can't seem to get anyone to respond to this. It seems like something should have engaged regardless of whether AP was on. This should be standard.

SUN 2 DRV | 02/04/2018

Whodat: Yes, it's very clear where the accident occurred.

On southbound 101, at the gore point where the left lane exits to become southbound Hwy 85. The driver drove his car into the stationary divider. That point typically has an attenuation barrier, but it had been destroyed in a separate recent accident.

Tesla-David | 02/04/2018

@RadOne, the short answer to your question is yes. Why it did not engage is an issue I am sure Tesla is looking into. My collision avoidance system has kicked in several times to prevent accidents in my S85D, and there is a link in a Model S thread I saw last year of a Model X stopping short of an accident on the Autobahn, avoiding a bad crash. The car in front of the MX hit a car stopped on the highway and flipped over, but the MX saw the stopped vehicle and hit the brakes, avoiding the accident.

whodat | 02/04/2018

Thanks SUN2DRV. I know this exit exactly. I generally do not use autopilot for this type of freeway change, going from one freeway to another. Too much has to work correctly. There is an additional confusion factor if the driver in front of me suddenly decides to change lanes or has taken the wrong exit; one needs to be prepared with little notice.

Rumi11 | 02/04/2018

This was such a tragedy, but a lot of the ensuing statements from the family don't add up. Of course they are gearing up for a lawsuit, but Tesla should not be held responsible. If the driver had paid attention as he should have, he would still be here today. Technology comes with responsibility and when it comes to driving, technology may pose a risk. This is something every driver knows and accepts upon entering a vehicle.

georgehawley.fl.us | 02/04/2018

Each digit of the TACC following-distance setting is equal to approximately 0.5 seconds. The average time it takes for an alert driver to 1. recognize imminent danger, 2. decide what evasive action to take, and 3. take the action is 1.5 seconds. This doesn't allow for the aftermath (skidding, etc.). I use 6. Of course, I am 80 years old and need to be conservative. A setting of 1 is asking for trouble.
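
A quick back-of-the-envelope check of those numbers in Python; the 65 mph highway speed is an assumption for illustration, not a figure from the accident report:

MPH_TO_MPS = 0.44704
speed_mps = 65 * MPH_TO_MPS          # ~29 m/s at an assumed 65 mph

for setting in (1, 6):
    gap_s = setting * 0.5            # ~0.5 seconds per digit of the TACC setting
    print(f"Setting {setting}: {gap_s:.1f} s gap = {gap_s * speed_mps:.0f} m")

reaction_m = 1.5 * speed_mps         # ground covered during a 1.5 s recognize/decide/act time
print(f"Distance covered while merely reacting: {reaction_m:.0f} m")

So setting 1 gives roughly 0.5 s (~15 m) of gap, far less than the ~1.5 s (~44 m) an alert driver needs just to begin evasive action.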

Take a look at the video of the lane split that the car was negotiating:
https://youtu.be/gKyjA9VUj-s Start at about 3:40 into the video.

The space between the exit lane and the left-hand lane of the continuing roadway gradually widens, with a lane boundary on each side leading to the concrete barrier. The Autosteering could easily have accepted that space as a lane to follow. The driver was very familiar with the road and the shortcomings of Autosteering. With hands off the wheel, TACC set at 1, and relying on Autosteering to negotiate an ambiguous condition at highway speed, this poor soul was inviting trouble. The misfeasance of the CA DOT made sure that, when he found it, he would pay the ultimate price. RIP

Remnant | 05/04/2018

@georgehawley.fl.us (April 2, 2018)

<< With hands off the wheel, TACC set at 1, relying on Autosteering to negotiate an ambiguous condition at highway speed this poor soul was inviting trouble. >>

However, accountability is clearly not just a binary AP/driver choice here, but a multi-factorial complexity, shared by the designers of this peculiar Hwy interchange, the Hwy maintenance agency, the AP program, and a driver who was probably impaired in some way.

Indeed, retrospectively, this 101/85 Hwy interchange became a death trap, as if designed that way.

SUN 2 DRV | 05/04/2018

I've taken this exact interchange every weekday for over four years. It's actually a very simple and straightforward exit. No tricky or sharp maneuvers required, as long as you realize that the left lane is going to exit. There are huge overhead signs ahead of this exit proclaiming that the left lane becomes Exit Only, in bold yellow to make it stand out from the normal green lane advisory signs.

The only reason for accidents here is that people just aren't paying attention to what lane they should be in until the very last minute. (Or sometimes people intentionally stay in the wrong lane until the last minute if it's flowing faster than their intended lane.)

I can see where AP might have problems discerning the lane markings, but an attentive human should have no problem whatsoever safely traversing this interchange.

Remnant | 07/04/2018

@SUN 2 DRV (April 5, 2018)

<< It's actually a very simple and straightforward exit. No tricky nor sharp maneuvers required, as long as you realize that the left lane is going to exit. >>

No, it's not simple. The "exit" is actually a road bifurcation stretching over at least half a mile and reaching a lane's width at the gore end, at the concrete separator provided with a collision attenuator that had been destroyed the day before by a collision.

Whoever put the attenuator there thereby acknowledged the risk of collision and tried to mitigate it.

Here are a few thoughts concerning the AP and the driver approaching this potentially deadly arrangement at this "very simple" interchange.

(1) Can you fathom any reason why warnings like a zebra paint pattern or rumble strips were not placed there AHEAD of the collision attenuator, rather than two orange cones?

(2) For the AP to avoid a collision at this collision-prone interchange, it would need info regarding the topography and the accident history of the location. The AP programmer should have built into it access to the accident database of the location and online monitoring of the approach to it (a rough sketch follows at the end of this post).

(3) The AP should have been able to summon means of alerting the Driver to the imminent possibility of collision.

(4) If the Driver fails to respond, he should be overridden by Emergency Braking and Light Blinking/Flashing.

(5) Note that the Driver, despite being an IT professional, had prior difficulties at that location, which indicates a disability of some sort, whether temporary or permanent.

The conclusion is that dismissing the AP and Hwy Maintenance accountability, and holding the deceased Driver fully accountable for the accident, is a very poor resolution of this conundrum.
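
As promised under (2), a rough sketch of such a location-aware check in Python; the hotspot database, its coordinates, and the alert radius are all hypothetical illustrations, not anything Tesla actually ships:

from math import radians, sin, cos, asin, sqrt

# Hypothetical accident-hotspot database: (latitude, longitude, alert radius in meters).
# The coordinates are illustrative only, roughly near the US-101 / SR-85 gore point.
ACCIDENT_HOTSPOTS = [
    (37.41, -122.07, 300),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def approaching_known_hazard(car_lat, car_lon):
    """True if the car is inside the alert radius of a known crash site; the caller
    would then alert the driver per (3) and, failing a response, apply emergency
    braking and flash the lights per (4)."""
    return any(haversine_m(car_lat, car_lon, lat, lon) <= radius
               for lat, lon, radius in ACCIDENT_HOTSPOTS)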

SCCRENDO | 07/04/2018

I agree with Remnant to a certain degree that more can be done by the department of transportation regarding road safety, and by the makers of EVs. But remember, this is all still Beta and the prime responsibility rests with the driver. The driver needs to be ready to take over at any moment. In the absence of EAP, who would be "blamed" for the crash? Unfortunately, it would still be the driver. EAP is a driver assistance tool.

Yodrak. | 08/04/2018

"Can you fathom any reason why warnings like a zebra paint pattern or rumble strips were not placed there AHEAD of the collision attenuator, "

Time, money, and a cost-benefit analysis.

"Note that the Driver, despite being an IT professional, had prior difficulties at that location, which indicates a disability of some sort, whether temporary or permanent."

Not exclusively. It could also indicate not paying attention for some reason. My opinion is that not paying attention is the more likely reason.

Remnant | 08/04/2018

@Yodrak. (April 8, 2018)

<< I could also indicate not paying attention for some reason. >>

Well, Yody, that IS a disability as well ... !

Yodrak. | 08/04/2018

"Well, Yody, that IS a disability as well ... !"

Yes, a self-inflicted one.

As the Tesla manual states repeatedly, "Failure to follow these instructions could cause serious property damage, injury or death." Hopefully other drivers will learn from this driver's error and his failure to use EAP as it is presently intended to be used, and more lives will be saved than are lost.

Remnant | 09/04/2018

@Yodrak. (April 8, 2018)

<< Yes, ["not paying attention for some reason" is] a self-inflicted [disability]. >>

Unfortunately, your statement is a ghastly cop-out: it amounts to exonerating the AP, whose alleged purpose is to bypass human error, on the basis of a human failure to salvage the AP from its own errors.

This is not just irrational; it may be dishonest as well.

SamO | 09/04/2018

Unfortunately, your rambling post is a word-salad of incomprehensible garbage, per usual.
