Accelerator-brake confusion --- accident in Japan

http://t.co/okIw7Enod7

http://t.co/nf3gRMfQRh

It seems that the Model S does not have a function to avoid this type of accident.
I wish Tesla would release a software update adding a function that automatically stops the car when a driver suddenly steps hard on the accelerator while the sensors detect an object directly in front of the car.

Roamer@AZ USA | 21. august 2015

ouch.

Tropopause | 21. august 2015

Do ICE cars have the safety feature you mentioned?

prp | 22. august 2015

BMWs certainly don't; those things have a habit of accelerating when you press the brake!
Also, I accelerate just before I overtake someone on the highway, so I don't really think the OP's suggestion is workable.

ihara | 22. august 2015

An additional required condition would be that the initial speed is zero.
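
Putting the two conditions together, roughly this kind of logic is what I have in mind. This is only a sketch; the sensor inputs, pedal signals, and thresholds below are all hypothetical, not Tesla's actual interfaces:

```python
# Hypothetical sketch of a pedal-misapplication guard: suppress drive torque
# only when the car is (nearly) stopped, an object is detected close ahead,
# and the accelerator is stomped very quickly to a large opening.

def should_limit_torque(obstacle_distance_m, vehicle_speed_mps,
                        accel_pedal_pct, pedal_rate_pct_per_s):
    NEAR_STANDSTILL = 1.0   # m/s: initial speed is essentially zero
    OBSTACLE_CLOSE = 3.0    # m: object detected directly ahead within this range
    STOMP_OPENING = 80.0    # %: pedal pressed almost fully
    STOMP_RATE = 200.0      # %/s: pedal applied very suddenly
    return (vehicle_speed_mps < NEAR_STANDSTILL
            and obstacle_distance_m < OBSTACLE_CLOSE
            and accel_pedal_pct > STOMP_OPENING
            and pedal_rate_pct_per_s > STOMP_RATE)


# Stomp while stopped with a wall 2 m ahead -> suppress torque.
print(should_limit_torque(2.0, 0.0, 95.0, 400.0))    # True
# Accelerating to overtake on the highway -> condition fails, no interference.
print(should_limit_torque(50.0, 30.0, 95.0, 400.0))  # False
```

Because the trigger requires the car to be stopped, it would not interfere with prp's overtaking example. All thresholds above are made up for illustration only.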

ihara | 22. august 2015

Actually, there are some Japanese cars that have this function, even in the small "kei" car category. Why not Tesla?

Ankit Mishra | 22. august 2015

Japanese cars, Danish cars, Norwegian cars, Australian cars, etc. all have unique functions. If people from every country demand that Tesla add small, country-specific features, Tesla will have to burn a lot of resources. I assume you know Tesla builds its cars in the USA, so its features will be USA-specific.

ihara | 22. august 2015

@ankitmishra Brilliant U.S. drivers would not make this type of error; perhaps elderly people do not drive in the U.S. But my point is that avoiding human error is always important, and Tesla is capable enough to develop and add this function easily.

ihara | 22. august 2015

The Japanese driver in this accident is 65. I consider 65 still young, yet he made a human error.

Ankit Mishra | 22. august 2015

Tesla is already overexerting itself; it is already doing more than it is capable of. I understand your point: you are asking Tesla because it is the only company that listens. It can't add country-specific functions, as that would require diverting resources it doesn't have. I hope Tesla listens to you and adds the function (when it can) to all new cars.

akikiki | 22. august 2015

American cars already have this function. Its more common name, across all manufacturers, is "the driver".

ihara | 22. august 2015

@ankitmishra Thanks. I do not wish to bother Tesla. However, you may remember the U.S. news coverage of the battery "fire" accidents in the U.S.; Elon Musk made special comments about those, including the titanium shield that was added. The recent accident in Japan has not spread widely, but it is exactly the kind of story people could use for a negative campaign against Tesla's autopilot efforts, for example TV news with wrong wording such as "A Tesla car with an autopilot function caused an accident."

ihara | 22. august 2015

@akikiki OK! That is just like the Japanese government's standpoint: cars should be driven by humans. I just thought Tesla could change the world.

Ankit Mishra | 22. august 2015

ihara plz staph.

tes-s | 22. august 2015

Shouldn't collision avoidance avoid this?

Mathew98 | 22. august 2015

One can be senile and confused even below the age of 65. If this is a typical driver, he would have at least 30 years of driving experience. How do you confuse the two pedals with that much experience unless you are drunk or senile?

His licence should be revoked before he gets confused again!

Case in point: an idiot with a head full of white hair made an illegal U-turn, cut in sharply in front of my parked minivan, and parked in front of it. He cut in too close and sideswiped the driver's-side front fender and bumper.

This was all captured on my surveillance cameras and onboard dash camera.

When I confronted the driver, he denied being at fault and proclaimed that he had been driving for 35 years! My response was, "How many cars have you hit in those 35 years?"

Now my insurance company will get his insurance to foot the repair bill.

ihara | 22. august 2015

@tes-s.ct.us That is what I wish to know. This Model S may be an older version without sensors, or the collision avoidance may not work under conditions like this. I am not sure.

ihara | 22. august 2015

Nobody was killed in this accident. But there are tragic accidents every year; e.g., a child accidentally killed by his or her grandpa making a mistake like this. So I wish Tesla would add this function, not limited to the forward direction but covering 360 degrees.

Son of a Gunn | 22. august 2015

I have a suspicion that the OP is trolling.

Ankit Mishra | 22. august 2015

Unfortunately he is not. This is common in this forum.

ttaliesin | 22. august 2015

This is called human error. Even collision prevention would not help in such a situation; these safety features can always be overruled by the driver, in this case by stepping on the accelerator when the collision detection kicks in. I guess what I want to say is simply this: if you want to prevent a lot of accidents like this from happening, you need to ban humans from driving. And even then, some accidents are still going to happen. I see a different issue here: the driver made a mistake, and some people think it is the automaker's fault for not having better prevention systems in place. With this mentality we should rather stop driving around and just stay at home. But I guess it's hard to take responsibility for mistakes made... so blaming somebody (the manufacturer in this case) is more convenient.

Madatgascar | 22. august 2015

This is a common enough error. My dad used to call it "whiskey throttle." Tesla already has automatic emergency braking, and it probably engaged and sounded an alert, but it cannot be fast enough and strong enough to override a sudden stomp on the accelerator, which must be interpreted as the driver's intention. The problem is, sometimes I might want to accelerate toward an object, for example to avoid a worse collision (being rear-ended).
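
To make that tradeoff concrete, here is a purely illustrative arbitration sketch (not Tesla's actual logic, and the threshold is invented): a near-full accelerator press is read as deliberate intent and wins over automatic braking, which is exactly what goes wrong in a pedal misapplication.

```python
# Illustrative only: how an accelerator stomp can override automatic emergency
# braking (AEB) when it is interpreted as the driver's intent.

def arbitrate(aeb_brake_request: bool, accel_pedal_pct: float) -> str:
    """Return the commanded action given an AEB request and the pedal position."""
    DRIVER_OVERRIDE_PCT = 90.0  # invented threshold: near-full pedal = deliberate intent
    if aeb_brake_request and accel_pedal_pct < DRIVER_OVERRIDE_PCT:
        return "brake"        # AEB acts; light pedal input does not override it
    if aeb_brake_request:
        return "accelerate"   # stomp overrides AEB -- the pedal-misapplication failure mode
    return "accelerate" if accel_pedal_pct > 0 else "coast"


print(arbitrate(True, 30.0))   # brake: AEB wins against light pedal input
print(arbitrate(True, 100.0))  # accelerate: stomp read as intent, AEB yields
```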

Madatgascar | 22. august 2015

BTW I don't see the OP as "blaming" Tesla for this accident. It's more like "stop me before I kill somebody."

Madatgascar | 22. august 2015

Look at the photo of the back of the car - it looks like there are no sensors. This was probably a pre-sensor car. The newer Teslas do have automatic emergency braking.

I have had the experience of automatic braking kicking in, and it did save me from a collision. However, it is a very last-minute thing, and it is really only designed to minimize the impact of unavoidable collisions. The audio alerts also helped me react, and the combination of automatic braking and human braking narrowly avoided an accident for me.

ttaliesin | 22. august 2015

I guess if you know that you need a "stop me before I kill somebody" feature, handing in your driver's license proactively is the more effective thing to do.

ihara | 22. august 2015

I do not intend to "blame" anyone. I just wish Tesla would provide this feature, which would be part of the autopilot Tesla will offer. There will be a legal issue as to responsibility for accidents; my position is that the responsibility lies with the driver, including whether he turns this feature on or off.

ihara | 22. august 2015

@ttaliesin But such people do not stop driving, saying, "I am still young enough and will never make such a mistake."

EternalChampion | 22. august 2015

I don't speak for Tesla, but I can assure you this is how they would want to respond:

Accelerator = go

brake = stop

Never the twain shall meet.

Spend less time on the forum and more time learning basic driving skills. Have a nice day.

AmpedRealtor | 22. august 2015

So now Tesla is supposed to solve all of humanity's driving problems? Good grief!

JLC | 22. august 2015

@ihara, that sounds like a reasonable safety feature that Tesla could easily add to cars with the sensor package (assuming that it doesn't already exist in the collision avoidance feature). It's nice that Tesla's design is so amenable to upgraded features via the frequent software updates. With my old BMW, a request like that would mean buying a new car...

ihara | 22. august 2015

@JLC Thanks!

Haggy | 22. august 2015

I went car shopping this weekend (for and with somebody else), and the cars we looked at ranged in price from $30K to $70K. They all had safety features that Tesla lacks. While Tesla hasn't announced more features, it would be impossible for them to implement anything that lets you summon the car without something similar to the pedestrian detection in other vehicles. They would have to have something that allows the vehicle to stop when it detects a person or car behind it for self parking, so it would make sense for it to be there when a person backs out of a parking space. I can't say whether they will add the flashing arrows and beeping that other cars give you in those situations, but I have to wonder whether this accident would have happened in a $35K Subaru.

Tâm | 22. august 2015

--------------
http://articles.latimes.com/2012/apr/13/business/la-fi-mo-bad-women-driv...

NHTSA estimates there are about 15 pedal misapplication crashes per month in the United States, and that the drivers in almost two-thirds of such crashes were women.

Drivers aged 16 to 20 and those 76 or older were most likely to be involved in pedal misapplication crashes.

“The single factor that may explain over-involvement in pedal misapplication crashes at both ends of the driver age distribution is poor executive function."
--------------

A Tesla will be able to be summoned and drive itself from your garage to your front door without crashing.

Thus, I believe Tesla will be able to mitigate these unfortunate pedal misapplication crashes very soon.

NKYTA | 22. august 2015

"They all had safety features that Tesla lacks."

Except in crash tests and actual real-world crashes, you mean?

I get your point, but that sentence jumped off the page as out of place...

mrspaghetti | 22. august 2015

While we're at it, we should put in a safety feature that protects against the driver turning left when he actually meant right.

Wow.

Tropopause | 22. august 2015

Even if Tesla only had one pedal, somebody would find a way to step on the wrong one!

Tesla, please remove all pedals from Model S. Switch to ESP. Wait... no! I give up.

EQC | 22. august 2015

The car may resist, but the driver should always be in ultimate control. How could the car know whether or not the driver really wants to plow through a wall?

What would be worse: a few stories of a driver accidentally plowing through a wall, or just one of the following hypothetical scenarios:

- "The bad guys had me and my family surrounded in our Tesla...the only way out was to floor it through a fence...but the car just refused to go!"

- "I spotted the escaped psychopath murderer through the restaurant window...he was holding a big remote control...there was no time, so I tried to floor my Tesla though the wall and end his plan. But the car would not budge. 3 seconds later, he pushed the button, and now half of Metropolis is gone!"

Tâm | 22. august 2015

@EQC

It costs money to have these life-saving functions.

So, in the first place, you need to open up your wallet and order the feature.

Secondly, once you've bought it, if you don't like it you can always turn it off so you can do all the things you want.

ihara | 23. august 2015

@EQC The car can know your ultimate decision if you give a positive instruction to change the default setting, or if you just turn this feature off.

bish | 23. august 2015

There is a tombstone in Uniontown Pennsylvania that reads:

Here lies the body

Of Jonathon Blake

Stepped on the gas

Instead of the brake

The accident in Japan was not Tesla related.

jordanrichard | 23. august 2015

Driver error. Since we don't know what the driver's intentions were, we can't comment on what type of sensors and other nanny widgets would have prevented this.

SbMD | 23. august 2015

@ihara - If this was indeed a pedal misapplication error (i.e., the driver stepped on the accelerator instead of the brake), then I doubt there is a software or hardware patch that could be implemented at this time to safely and consistently solve the problem. It would require the car to have extremely accurate information about the conditions in and around it, correctly determine that the driver is doing something hazardous that requires intervention, and then override the driver's actions in a timely fashion. This type of decision making gets to the heart of self-driving cars, including where to draw the line between the driver and the car taking over for the driver.

Simply put - if the car is wrong, it creates another set of problems and risk of accidents.
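
As a toy illustration of my point (the function and numbers below are invented, not any real system's): the intervention decision ultimately comes down to a confidence threshold, and moving that threshold trades false interventions against missed ones.

```python
# Toy illustration of the intervention tradeoff: a lower threshold catches more
# real pedal misapplications but also blocks drivers who genuinely need to
# accelerate; a higher threshold rarely interferes but misses real events.

def should_intervene(hazard_confidence: float, threshold: float = 0.9) -> bool:
    """Override the driver only when the system is confident the accelerator
    input is a misapplication rather than deliberate driving."""
    return hazard_confidence >= threshold


print(should_intervene(0.95))                 # True: confident -> car overrides
print(should_intervene(0.60))                 # False: ambiguous -> defer to driver
print(should_intervene(0.60, threshold=0.5))  # True: aggressive tuning, more false alarms
```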

@Haggy - the 2012 NHTSA report found pedal misapplication occurred 38 times in Subarus when researchers mined the data from North Carolina (the only state at the time that recorded such information) and media reports. Objectively, from a safety assessment standpoint (among others), one should choose the Tesla over a Subaru.

ihara | 23. august 2015

@SbMD That is what the news reported as the reason for the accident. There exist cars with this feature; in Japan it is offered as an option on some models, so I believe it is technically possible. It does not need to be perfect, and even reducing the severity of an accident is meaningful.
I understand that we would like to control the car. The correct image of this feature is assistance for the driver, rather than taking over human control.
Finally, let me repeat that you can turn this feature off.

ihara | 23. august 2015

Additional comments. This feature was added in version 3 of EyeSight; versions 1 and 2 do not have it. Also, EyeSight itself is an option on Subaru cars. I assume the Subaru cars in those 38 accidents did not have EyeSight version 3.

http://sp.subaru.jp/eyesight/sp/function/index.html

Sorry, it is only in Japanese; I could not find a corresponding English page.
I also found that this type of accident occurs about 7,000 times per year in Japan.

Tropopause | 23. august 2015

I'm all for reducing accidents. However, nothing comes without a price. The greater a car's ability to override the driver, the less authority the driver will have.

The ability to turn OFF these safety features prevents them from doing their job, which leads governments to decide whether to mandate said features.

Also, the more automation technology a car has, the more a human driver's skills will likely deteriorate, leading to the need for even more automation.

Either we drive our cars, or they drive us. Choose a side.

SbMD | 23. august 2015

@ihara - yes, I'm aware of the EyeSight tech, but there are limits to it, such as camera visibility in bad weather, etc. Also, Nissan has had this technology since 2012 but has only deployed it in a limited fashion and is still testing it, if I understand correctly. My point again is that just because you have the hardware and software, it does not change the fact that this is still a complex problem.

7,000 accidents per year in Japan is a number I saw as well. That appears much higher than in the US (NHTSA's estimate of about 15 per month works out to roughly 180 per year), although the U.S. numbers may be underestimated.

The first important task here is not how to change our cars but how to improve drivers. If you look at the literature on why people make these errors, you will understand the complexities better and will probably agree. Any car tech that helps is simply an adjunct to safe driving (some features of Subaru's EyeSight highlight this) and is still imperfect. Again, this specific problem embodies one of the complexities of self-driving cars.

I'm not disagreeing with self-driving cars or safety measures, but let's think about how we improve driving and reduce errors first.

ihara | 23. august 2015

@SbMD I fully agree with you that we should improve drivers. But there are many elderly people who are confident enough to reject training, and it would not generally be proper to force people to do something. My position is that not being perfect cannot be a reason not to develop or introduce a feature. We who love Tesla are not Luddites, right?

By the way, EyeSight(R) by Subaru (Fuji Heavy Industries) is excellent in that quite reliable safety functions are enabled by stereo cameras alone, without using radar. On the other hand, it is true that the system has limitations. I am not so sure about Nissan, but I agree with you that I have not seen great improvement in their system.

SbMD | 23. august 2015

Indeed, good points, and I enjoy hearing the perspectives of people on this forum.
There does come a point where elderly or inexperienced drivers (the ones most at risk of having this happen to them) need either further training or to be removed from the road. This is a very sensitive area, and there are some U.S. laws that cover it in very specific situations. Having elderly driving parents and up-and-coming drivers, and sometimes having to recommend driving restrictions to people as part of my profession, I understand this on a number of levels.

We can agree, then, that EyeSight has inherent limitations on use and even has to incorporate logic for driver-error lockouts. Hence it may be decent in some settings but is still not an ideal solution. That's also why you can disable it; otherwise it would always be "on".

All the companies interested in self driving cars are looking at the larger issue of when to intervene. They need time to do it right.

I wouldn't buy a Subaru for EyeSight, though, and I wouldn't use their current tech as a crutch for poor driving or a safeguard against accidents/driver error. It has a relatively narrow focus of utility. I also doubt it would have helped the person whose accident you posted for this thread; judging by the pictures (running up an incline and Galway into a storefront), they were probably going at high speed. I doubt EyeSight or any nascent solution available would have stopped them in time.

Just my opinion. Perhaps others would disagree, probably on the Subaru forum :)

Bottom line: if someone can't drive, they need to be off the road and not driving any car. There are other options in every country. There is no valid reason, nor technology, that justifies enabling a dangerous driver on the road.

SbMD | 23. august 2015

Galway should be halfway.
Autocorrect: another example of tech that doesn't work perfectly...

ihara | 23. august 2015

There is another sensitive area relating to human rights: diseases such as epilepsy. This month, at Ikebukuro in Tokyo, five people were injured by a car that did not stop, and one of them was killed. The driver has epilepsy and had been authorized to drive on the condition that he take his medicine every day. According to the news, he forgot to take his medicine that day and lost consciousness. I wish that car had had a function to stop when it senses a person.

inverts | 23. august 2015

This is a lose-lose situation for Tesla: either the lack of the "safety" feature causes the crash, or the auto-function causes some other crash. While some auto-features are always good (e.g., anti-lock brakes), for others it's a bit more iffy. Say the choice is either hitting a traffic cone head-on or getting nailed by a train sideways. I prefer the traffic cone, and I don't want the MS to override my intention. Given that, I prefer more driver responsibility.

I remember the Prius unintended-acceleration story, which just turned out to be bad drivers. More personal responsibility is in order.
