Auto pilot fatality

Driver of a Tesla on Autopilot killed in a crash; investigation underway, the LA Times reports.

NVJoule | 30 June 2016

Any further information?

Triggerplz | 30 June 2016

A tractor-trailer turned in front of a Model S on a divided highway. If you only read the headlines of the article, you would think the Tesla Autopilot was at fault. Wrong. Come on, media, stop with the BS; it was the tractor-trailer driver's fault.

bak_phy | 30 June 2016

It's clear that the truck was at fault. The question is whether a human driver would have been able to avoid the accident, or at least lessen it.
In any case, AP was being used where it wasn't meant to be, and that required extra caution.

michael | 30 June 2016

Would a human have lessened the accident? Who knows. Maybe? Worsened it? Well, what's worse? Avoided the accident? Sounds like definitely not. The press is going to have a field day.

In my limited experience, Autopilot has alerted me to, or avoided, more "minor" accidents than I would have had without it. Especially in that first week or two, when you are still poking around way too much on the touch screen while driving.

Redmiata98 | 30 June 2016

This is bubbling on the TMC forum but appears to have happened back in May. There is also discussion, tied to this thread, about the projected changes in the upcoming version 8.

rdalcanto | 30 June 2016

This shows clearly who the trolls are. Only a troll would blame autopilot. Autopilot is NOT self driving. The 2017 Mercedes has a similar system to Tesla, although according to reviews, the Mercedes system is not as good. Both are driving aids. If someone suddenly appears in front, blocking the road, the driver has to try to avoid the collision. Autopilot does not have decision making and accident avoidance. The instructions are very clear that the driver is in charge. In a few years, when self driving cars become available, people can blame the system if it fails to avoid an avoidable accident. Unfortunately, things happen on the road that will sometimes result in an accident and a fatality, no matter what. That is why driving is so dangerous.

Sleepydoc1 | 30 June 2016

How could anyone or anything stop in time when a truck suddenly appears in front of you? The car had to be very close to the truck to go under it. How long is the trailer in front of you? How much distance do you need to stop?
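
Those questions can be put in rough numbers. As a minimal back-of-envelope sketch (the friction coefficient and reaction time below are my own illustrative assumptions, not accident data), total stopping distance is reaction distance plus braking distance:

```python
# Rough stopping-distance estimate: reaction distance plus braking distance.
# d_braking = v^2 / (2 * mu * g), from the kinematic relation v^2 = 2*a*d.
# All constants are illustrative assumptions, not figures from the accident.

MU = 0.7          # assumed tire-road friction coefficient (dry asphalt)
G = 9.81          # gravitational acceleration, m/s^2
REACTION_S = 1.5  # assumed driver perception-reaction time, seconds

def stopping_distance_m(speed_mph: float) -> float:
    """Total distance (meters) to stop from speed_mph: reaction + braking."""
    v = speed_mph * 0.44704          # mph -> m/s
    reaction = v * REACTION_S        # distance covered before braking begins
    braking = v ** 2 / (2 * MU * G)  # distance covered while braking
    return reaction + braking

for mph in (45, 65, 85):
    print(f"{mph} mph -> ~{stopping_distance_m(mph):.0f} m to stop")
```

Under these assumptions, stopping from 65 mph takes on the order of 100 m including reaction time, so a trailer crossing only a few car lengths ahead leaves no chance of stopping for either a human or a computer.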

jestah | 30 June 2016

Horrible accident. The TSLA blog response is an interesting read.

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

This sentence doesn't make sense to me.

siliconhammer | 30 June 2016

The driver of the other vehicle turned across oncoming traffic, failing to do so in a safe manner which would avoid collision with approaching vehicles. It seems his turn was executed in a manner where he possibly expected the oncoming drivers to slow down for him. Whether the oncoming vehicle was being operated by AutoPilot or not this puts fault squarely on the other driver.

The AutoPilot technology available from Tesla Motors in its newer Model S and Model X vehicles is an option which must be chosen by the buyer. When buyers choose AutoPilot, they are warned that it is a "beta" technology under development and that the computer will collect data about its use and send it back to Tesla. I recently ordered a Model X and am still waiting for delivery, but during my test drive I was given an orientation regarding AutoPilot. I was told it is only for use on controlled-access highways, as a modern equivalent of cruise control, and that while it can do more, there are things it cannot do, like see red and green lights, so it should not be used where intersections or oncoming traffic are present. So the driver may have been using his AP for a purpose not intended. This does not mean AP should not improve, but in this case the fault may lie with the Tesla owner for using AP in an unintended manner, just as if somebody tried to blow-dry their hair in the shower. It's not the hair dryer's fault they were zapped; the user failed.

I hope cooler heads prevail in this case and the true root causes of the accident are recognized. At the same time Tesla should be tasked with taking the information learned and building a better product which either can be automatically disabled in certain contexts or improved to manage these situations better. Still, AutoPilot is not the cause, don't curtail this important technology and its development because of this.

elguapo | 30 June 2016

@silicon Totally agree. That said, the bright sky comment doesn't make sense. The radar or ultrasonic sensors should've picked it up - regardless of whether AP was engaged, as a commission avoidance tool - I would think.
Either way, tragic and a reminder we need to always be aware and be thankful for each day we have.

elguapo | 30 June 2016

Collision. Although I like to avoid commissions too!

PXChanel | 30 June 2016

@rdalcanto I really liked your comment: "Autopilot is not self-driving...[it is just] a driving aid." This needs to be made clear to all owners/users; perhaps they don't understand that, nor that it is a beta product. Personally, I did not purchase AP, since I would rather take a wait-and-see approach. I am enjoying the Model X without AP all the same. My condolences to Mr. Brown's family.

elguapo | 30 June 2016

@PX The DS is supposed to explain the "driver aid" piece to all owners at pickup, and it's made very clear in the OM. True that we must all still stay aware.

Interesting that AP has driven over 130MM miles. That's like driving around the Earth at the equator over 5,000 times. I think!
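
That laps-around-the-Earth figure is easy to sanity-check, assuming the commonly quoted equatorial circumference of about 24,901 miles:

```python
# Sanity check: how many equatorial laps is the quoted Autopilot mileage?
AP_MILES = 130_000_000        # cumulative Autopilot miles cited in the thread
EARTH_EQUATOR_MILES = 24_901  # Earth's approximate equatorial circumference

laps = AP_MILES / EARTH_EQUATOR_MILES
print(f"~{laps:,.0f} laps around the equator")  # roughly 5,200
```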

siliconhammer | 30 June 2016

@elguapo I do not see anything "bright sky" about my comment at all. The media seems fixated on blaming AP; I am simply pointing out that there were possibly two sources of human error. Sure, make the sensors better, improve collision avoidance, but see this case for what it is.

elguapo | 30 June 2016

@silicon I am agreeing with you. I am just saying I don't understand why Tesla says the bright sky could cause an issue. Aside from the camera, what part of the system would be negatively affected by a white surface against a bright sky?

I agree this is being blown way out of proportion. It is tragic, but the media will do anything to slam TM. How many mainstream cars with collision avoidance have been in fatal accidents? Sadly, probably more than one.

DarthB | 30 June 2016

If you are relying on a technology that impacts your safety, you had better understand how it works. AP is not perfect: it can only detect lanes through image recognition (for lack of a better technical term) and surrounding cars through its front-facing camera and sensors. I have seen enough YouTube videos showing that both can be faulty in edge cases and less-than-ideal conditions not to fully trust it. Shadows on a wall, suddenly appearing objects (which I think is the case here), or weird lane layouts can all defeat AP.

Drive safe, people, and always use it with caution! Condolences to his family :(

r.hauser | 30 June 2016

I think what Tesla was saying was that the camera could not differentiate the contrast between the truck trailer and the bright sky, while the height of the trailer, i.e., the room under it, prevented the other sensors (radar and sonic) from detecting the danger.

PXChanel | 30 June 2016

My hypothesis: I think the Autopilot sensors might be directed to street level, and the majority of the body mass of the semi truck is higher than most vehicles, so perhaps the Autopilot sensors are not as sensitive to higher obstacles.

Triggerplz | 1 July 2016

He was a Navy SEAL. Wow, that's the elite of the elite. RIP.

borodinj | 1 July 2016

Definitely feel for his family.

The articles, as predicted, are completely off base, sensationalist, and are devoid of facts. As has been pointed out by countless others, AP is beta, it's only a driving aid, and it requires driver involvement -- precisely because of situations like this where a truck turns in front of oncoming traffic, requiring intervention by either technology or a human to avoid it. To say self-driving technology caused the accident is flat out wrong.

Kutu | 1 July 2016

The victim of this accident had reportedly posted a video of AP preventing an accident a while back. Maybe he had a dashcam and it, or the memory card, survived the accident?

elguapo | 1 July 2016

@r.hauser I agree on the sensors. Oftentimes when I am on the highway, they don't pick up trucks except near the wheels or cab, because of the space below the trailer.

Very, very sad.

Gayatrikr | 1 July 2016

A bunch of crap. These are idiots who are either out to get money (more likely) or who don't understand how to use it. I use it all the time, and it warns you every time you turn it on that you are responsible. Stupid NHTSA maybe wants a payoff, or kickbacks, or it's plain politics. Are they now going to investigate every stupid accident of every other car?

Gayatrikr | 1 July 2016

Hey, NHTSA and all the other shorts: get a life, or post the $ you want and let's be done with it.
Done with the Prius, and now on to Tesla?

RonaldA | 1 July 2016

An unfortunate loss of life on the road. I feel for the family and hope people will learn from the information that results from review of the data. Some accidents are unavoidable by human or computer, due to time, velocity, and momentum. Others indicate a potential problem and a need for improvement. In either event, my sincere condolences to the family.

eric.zucker | 1 July 2016

Tragic loss of life. I'm not at all convinced a human driver would have been able to avoid the collision any better.
I hope the NHTSA does a proper unbiased investigation, this can help us all.

Need to consider the road configuration, truck visibility and lights, truck driver license, health condition, fatigue and stress factors, phone or text usage, time of day, DUI, truck payload... Lots of parameters.

We Europeans have a requirement that trucks have side guards to prevent vehicles from getting stuck underneath. This could have helped the AP radar detect the obstacle.

It won't bring back Joshua Brown. RIP.

Gert van Veen | 1 July 2016

My condolences to the family.

Gayatrikr | 1 July 2016

His Tesla was named Tessy; I remember some postings with that name.

Eric, are you looking to consult for the NHTSA? It's a stupid accident; you don't have to write a PhD.

aesculus | 1 July 2016

Dan Galves, Mobileye’s Chief Communications Officer, issued the following statement: “We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.”

brando | 1 July 2016

Truck side guards: an internet search is helpful, as usual, to gain actual knowledge. Some benefits include:
- helping to keep people, bikes, cars, etc. from going under the truck
- improved airflow, for better fuel economy
- possibly additional storage (like buses?)
- an increased detection area for camera, lidar, sonar, etc.

Thanks, Eric.zucker

borodinj | 1 July 2016

@aesculus - Great info from Mobileye. Thanks for posting.

jpboyerva | 1 July 2016

Aesculus, thanks for the info from Mobileye. I have only had the car two weeks, but I used Autopilot on a trip to Greensboro. Although I was cautious, I was probably not cautious enough. This tragic accident should add some reality to my otherwise very nice ride. I hope his death leads to improvements for all who will follow him behind the wheel.

Gert van Veen | 1 July 2016

This is a terrible accident. What possibilities are there to prevent this?
From the NHTSA website:

NHTSA defines vehicle automation as having five levels:
No-Automation (Level 0): The driver is in complete and sole control of the primary vehicle controls – brake, steering, throttle, and motive power – at all times.
Function-specific Automation (Level 1): Automation at this level involves one or more specific control functions. Examples include electronic stability control or pre-charged brakes, where the vehicle automatically assists with braking to enable the driver to regain control of the vehicle or stop faster than possible by acting alone.
Combined Function Automation (Level 2): This level involves automation of at least two primary control functions designed to work in unison to relieve the driver of control of those functions. An example of combined functions enabling a Level 2 system is adaptive cruise control in combination with lane centering.
Limited Self-Driving Automation (Level 3): Vehicles at this level of automation enable the driver to cede full control of all safety-critical functions under certain traffic or environmental conditions and in those conditions to rely heavily on the vehicle to monitor for changes in those conditions requiring transition back to driver control. The driver is expected to be available for occasional control, but with sufficiently comfortable transition time. The Google car is an example of limited self-driving automation.
Full Self-Driving Automation (Level 4): The vehicle is designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip. Such a design anticipates that the driver will provide destination or navigation input, but is not expected to be available for control at any time during the trip. This includes both occupied and unoccupied vehicles.

Do I understand correctly that the current AP functions at Level 2?
I wonder whether this is not an inherently dangerous situation. The car does a lot very well by itself, so there is no constant urge to pay attention. On the other hand, if something goes wrong, there is not much time to react.
At Level 1 the driver has to do everything, so there is no room for letting concentration lapse.
At Level 3 the driver can pay less attention, because the car warns early enough to react.
Level 2 is the worst of both worlds: no urge for attention, no time to react.

Should it be possible to have a car operating at level 2?

aesculus | 1 July 2016

Definitely at Level 2. Alpha at Level 3, at best.

brando | 1 July 2016

The solution is truck side guards, right?

Or the truck driver not driving across an oncoming lane of traffic? Driver training?

Or a divided roadway, so the truck driver can't easily block oncoming traffic.

rossRallen | 1 July 2016

Agree with @siliconhammer's comment, and I am sorry that Mr. Brown perished in the accident. Some accounts say that if it had been a frontal impact instead of driving under the trailer, Tesla's safety systems and design for crash safety could have protected him.

As usual, the ignorati in the press are in a feeding frenzy over a Tesla accident. Some of the wording used is highly prejudicial toward Tesla.

From what I have read, this was entirely the fault of the truck driver's illegal maneuver, and it happened so fast that neither the Tesla driver nor the AP/emergency braking system could react.

The LTAP functionality, mentioned in the Mobileye quote above, will add an important, missing element of sensing and safety. (Can I get it added to my Model X, or do I have to get a new car?)

Curious is the comment in some articles that the Tesla driver was watching a movie at the time. That claim came from the truck driver who, if he was observant enough to notice that, should have seen the Tesla before turning into it.

I use AP carefully and with hands-on. The TACC is the most useful and dependable feature, although it often reacts by slowing down too much when the car ahead moves out of the lane to turn off the road.

And, I've never used the automatic lane change feature and I doubt if I ever will activate it. Coming home from a Tesla SC, a white Model S with dealer plates (a loaner or on a test drive?) nearly plowed into my X making an abrupt lane change into my lane. I've always wondered if that was just carelessness or the auto lane change feature not detecting me. Rear right-quarter visibility in the S isn't great. My reaction was to swerve right to avoid the collision, and with good luck there was no one in that lane.

I don't know if it's just my sensitivity driving around in an expensive car, but it seems that there is more risky and stupid and unsafe behavior on the roads than ever before. I'm certain some of the things I've seen are due to texting and smartphone use by the driver. That's a factor we didn't have to contend with 10 years ago, so maybe it's not just me.

rdalcanto | 1 July 2016

Based on another report, he passed a car that was going 85 mph pretty quickly. I don't remember what the upper speed limit of AP is, but he was going very fast. If he was close to 100 mph and the truck turned in front of him, there is probably not enough range in the radar and camera to make any life-saving corrections when there is the equivalent of a stationary object in the middle of the road.

Remnant | 1 July 2016

@ PXChanel (June 30, 2016)

<< I think the Autopilot sensors might be directed to street level, and the majority of the body mass of the semi truck is higher than most vehicles, so perhaps the Autopilot sensors are not as sensitive to higher obstacles. >>

That's a very sensible explanation, quite consistent with Tesla's description of the event:

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."

It is obvious that, at this time, Tesla's sensors do not detect obstacles higher than that "high ride height of the trailer", but how high should detection go? How about low-flying aircraft, such as some police, traffic, or press helicopters? You wouldn't want the AP to steer your Tesla into a ditch because of them, would you? And how about marking the frames of long-bed trailers in a way that stands out from the sky, both in the daytime and at night?

Tesla does warn us of the need to maintain surveillance of the AP performance and the inherent approximation caused by poor or faded road signage, so there's no excuse for failing to follow its advice.

To have full AP driving we might have to have special AP roads; in the meantime, we need to preserve the intervention-ready stance recommended by the AP manufacturers.

Triggerplz | 1 July 2016

The Map of the accident....

lilbean | 1 July 2016

Triggerplz | 1 July 2016

@lilbean Nice, and they showed him in the video he made about Autopilot. How sad. I have great respect for our Navy SEALs.

lilbean | 1 July 2016

@triggerplz- I have tremendous respect for them too. They are super-humans.

Joyceylkwok | 1 July 2016

Hi, I am new to this, and I have just ordered the Model X with the Autopilot option. Do you know if the car can perform an emergency brake on its own, with or without Autopilot?

Triggerplz | 1 July 2016

@lilbean I'm 6'4", 220 lbs, and in good shape, but I couldn't make it through that Navy SEAL training for a billion dollars. I saw a documentary of their training; I could hang for a few weeks, then I'd be out of there. The stuff they have to go through is unreal. You're right, they are superhumans.

lilbean | 1 July 2016

@triggerplz I saw that documentary too. It's amazing. I'm 5'1" and in great shape, and I wouldn't last five minutes. Navy SEALs are smart and strong mentally, physically, and emotionally.

rossRallen | 1 July 2016

@joyce There is an emergency, straight-on collision-avoidance feature independent of AP. You can choose to disable it, but it is normally enabled.

rossRallen | 1 July 2016

Why isn't the press all over the Google self-driving car, which has had 11 or more crashes and a top speed of, what, 20 miles per hour?