
Dangerous Autopilot

edited in Model X
Several weeks ago I was driving on the A13 dual carriageway with Autopilot set at 70 mph (the limit) when, for no reason, the car slammed on the brakes. The car behind had to swerve to avoid the rear of my car and the driver was understandably upset. The thing was, nothing was in front of my car in any lane, and no warning was given. I reported it to Tesla service immediately; they found no fault.

A week after this I was driving again, same road, different direction, but this time in a jam. The car came to a halt, no problem. Then the car in front pulled out of the lane and my car accelerated hard at the truck ahead. I managed to hit the brakes hard enough to lock the wheels and avoid the collision. Again, no warning.

Finally, six weeks ago, driving in traffic on a single-lane road, again with Autopilot on to take the strain out of traffic, the car spun the wheel left for no reason, after which it said something like "auto steering disengaged". If anyone had been on my left, the car would have taken them out.

So after each event I spoke to Tesla.
First response: that shouldn't have happened, we will investigate, you may need to reset the car. Never heard another word.

Second response: that shouldn't have happened, we will investigate. Never heard another word.

Third episode: that shouldn't have happened; we recommend you don't use it until we find out what went wrong. Five weeks later I got a call saying the auto steer had disengaged. My response: I know it did, after doing a sharp left turn. Their response: well, you shouldn't be using it in traffic or on any road other than a motorway or freeway, but we will look into it.

The last contact was two weeks ago. Meanwhile I don't use Autopilot, and there is still no mention of the screen they were ordering seven weeks ago to replace my faulty one.

Way to go Tesla


  • edited November 2018
    Yes, I've experienced all of the same but I will say mine has gotten better over time. Not sure why though. I have fewer phantom braking incidents and now very few jerks left. I basically only truly trust AP on the toll lane going straight on the interstate.
  • edited November 2018
    Never had the acceleration issue. Did you have TACC enabled (minus the auto-steer)?

    Perhaps you could have the service center check your front sensors?
  • edited November 2018

    I hope you are aware that you are using a beta product and that you need to read its manual to see its "limitations", which is a euphemism for accidents/deaths if a driver is not ready to make a timely correction.

    1) Phantom braking: that's the behaviour to expect from radar. Just as a heater is hot and a freezer is freezing, radar keeps producing phantom braking. That's its nature, so you need to be ready to press the accelerator as needed.

    2) Crashing into stationary objects: that has been well documented in the Florida and Mountain View fatal Autopilot accidents. It's another consequence of relying on radar: you need to expect that the radar algorithm can allow crashing into stationary objects.

    3) Wrong steering: that is very much expected, and that is why the manual says Autosteer is a hands-on feature. At least one of your hands must be on the steering wheel in order to correct the steering in time. If your hand is not on it with a constant light counter-torque, you might not notice the undesirable steering, and by the time you get your hands back on the wheel it might be too late.

    4) Tesla's responses: usually, Tesla will say something generic, such as that the system performed as expected.

    It is just like when you ask a pediatrician why your baby can only utter the word "dada" but not "mama", and people might get angry when the answer is that nothing is wrong with the baby at all!

    There's nothing wrong with the Autopilot system, because it is still in beta, just like a baby who can only say "dada" and not "mama".

    Just give it time, Autopilot will progress to Full Self Driving and you won't need to place any hands on the steering wheel anymore.
  • edited November 2018
    Druck me funk !
  • edited
    Let me say that only the OP knows what really happened. But, in my experience, Autopilot doesn't do drastic things like "...slam on the brakes..." or "...spin the wheel...". Those are, IMO, hyperbole... When AP applies the brakes inappropriately, it tends to 'brake check' when not expected, and when traveling at a good clip this can give the driver a mini-adrenalin dump that's not particularly pleasant. Also, when it steers badly, it tends to act like a confused student driver: it slows down and seems to be in a paralytic quandary so you are forced to take control... It is the human that remembers the panic of the moment, but sometimes not the unbiased fact(s).
  • edited November 2018
    I too have had incidents of "phantom braking" on the highway, usually in sunny conditions when there is a very dark, pronounced shadow on the road under an overpass, but it has happened a couple of times when there was no apparent obstruction or visual mirage that the cameras might mistake for a stationary obstruction in front of the car. I have also had faulty steering occur when using Navigate on Autopilot and entering an exit ramp where the lane lines disappear. These "faults" of the Autopilot/driver-assist system come with it being a beta software release, and it is our job as drivers to know and anticipate these shortcomings and take over control of the car until refinements come along. I am careful not to use Autopilot/Auto Steer or Navigate on Autopilot unless I am on a road with little other traffic near me; I kick it off when the road is clogged with traffic.
  • edited November 2018
    I've had AP slam on the brakes on I5 three times now randomly. Nothing on the road, perfect weather, no cars around me. Not good. If someone had been behind me it could have very easily caused an accident.
  • edited November 2018
    I've had the same issue several times. But mostly I'm posting just to say I'm so tired of people explaining away the software as beta. If it doesn't work why are we paying for it?
  • edited

    It's just like a toddler who is hardly able to stand, walk, and run. Without supervision, a toddler might fall down the stairs, and the babysitter could be in trouble (despite efforts to blame the toddler).

    If a baby is unable to stand, walk, and run, why are there so many people who love to take care of a baby?

    That's pretty much the same way with Tesla Autopilot / Full Self Driving.

    The technology is still in its very infancy stage.

    Phantom braking is not unique to Tesla. The article below cites Uber's autonomous program, which wanted to show its boss that phantom braking was solved by disabling the automatic braking system for a smoother ride, and guess what?
  • bpbp
    edited November 2018
    After the first well-publicized AP accident (which may have started the divorce from Mobileye), it's possible the software was designed to over-react to possible road hazards, causing phantom braking, often at light-to-dark changes in the roadway ahead caused by dark pavement or shadows from objects overhead.

    This is a safety issue, especially when it happens in heavy high-speed traffic on the highway and the vehicle behind doesn't expect you to slow quickly for no apparent reason.

    This is beta software - and will not always make the correct decisions, just like a new human driver.

    Also, features like NOAP, Auto Lane Change, and AutoSteer (lane keeping) really work best on limited access highways, where the road conditions are relatively simple. Other manufacturers limit this class of features to only limited access highways. So far, Tesla has allowed AutoSteer on other roads, even though the software may encounter situations that it isn't prepared to handle.
  • edited November 2018
    It's unpleasant when these AP mistakes happen, but there are a few people who have really high expectations of beta software. In reality, sometimes AP feels like alpha software, but the bottom line is it's NOT ready to be error-free.

    AI = decision-making based on inputs. In a perfect world, what you want is:
    100% correct decisions (obstacle => avoidance maneuver; no obstacle => no action)
    0% incorrect decisions (false obstacle => avoidance; true obstacle => no action)

    In the real world you hope for >99.9% correct decisions and <0.1% incorrect decisions. To achieve that, you need (1) more accurate detection and (2) smarter algorithms. Tesla is responsible for both; we're all contributing to #2.
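    To make the correct/incorrect split above concrete, here is a toy scoring sketch in Python (purely illustrative; the function names, frame counts, and error rates are invented and are not from any real Autopilot data):

```python
# Toy sketch (invented example, not Tesla's system): score a detector's
# decisions against ground truth to separate correct from incorrect ones.

def score_decisions(ground_truth, decisions):
    """Both args are lists of booleans: True = obstacle present / car braked."""
    tp = fp = fn = tn = 0
    for actual, braked in zip(ground_truth, decisions):
        if actual and braked:
            tp += 1      # real obstacle, braked: correct
        elif not actual and braked:
            fp += 1      # phantom brake: incorrect
        elif actual and not braked:
            fn += 1      # missed obstacle: incorrect (the dangerous case)
        else:
            tn += 1      # clear road, no action: correct
    total = len(ground_truth)
    return {
        "correct": (tp + tn) / total,
        "phantom_brake_rate": fp / total,
        "missed_obstacle_rate": fn / total,
    }

# 1000 frames: 10 real obstacles; the detector misses 1 and phantom-brakes twice.
truth = [True] * 10 + [False] * 990
braked = [True] * 9 + [False] + [True] * 2 + [False] * 988
print(score_decisions(truth, braked))
# -> {'correct': 0.997, 'phantom_brake_rate': 0.002, 'missed_obstacle_rate': 0.001}
```

    Even at 99.7% correct, a phantom brake every few hundred highway miles is still very noticeable to the driver, which is why the last fraction of a percent is the hard part.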
  • edited November 2018
    The "beta software" excuse is wearing thin. I paid $5000.00 for autopilot 2 years ago. Why does it still scare the heck out of me from time to time?
  • edited November 2018
    @Model X Guy

    It's the nature of this particular technology. Consumers might assume these problems are easily solvable, but the technology is nowhere near that point currently.

    Waymo/Google Autonomous has been trying to solve these problems since 2009 and it still does not allow any consumer to use its product.

    Commercial airline autopilot can still fly into a mountain, or into the ground short of a runway, or, as recently, nose-dive into the sea in less than a minute.

    And that despite the promise that the technology would be here next year; then, when next year arrives, it'll be next year again.

    So "I paid 2 years ago" is not a realistic expectation. Commercial airline autopilot has been around since before World War II, roughly 100 years, and 2 years of Tesla's time is tiny compared with the history of this automation puzzle.
  • edited November 2018
    I posted this on another web site, for the record:

    I was driving south on I-5 in Washington state two weeks ago, going 70-75 mph on Autopilot with navigation on, when suddenly the brakes locked up and the car started coming to a sudden stop. I hit the brakes myself and then the accelerator as quickly as I could. I looked in my rear-view mirror and luckily there was nobody following me. Wow, that was close. So I told the guys at my service center in Portland, Oregon about what happened and one said “ya, we have heard that happens at times”. I asked him, if I had been rear-ended, would Tesla stand behind the accident? He said no, that Autopilot with navigation is beta status. He said I would have to give him the exact location on I-5 and he would pass it on to engineering, and they would have to go in and change the programming in the system to tell the cars not to jam the brakes at that location. That did not sound very reassuring to me.

    I also notified Tesla corporate from my Tesla account and they responded with a boilerplate statement that was not very satisfying. I love my car and it's the best automobile I have ever had. I was counting the other day: I have owned more than 50 cars so far and this is the best. However, if I were to be killed in an accident because of some programming error, that would be tragic. I love Autopilot and have driven 14,000 miles in ten months using a lot of it. This is the second time the car has done this brake-locking thing. The first time it was just for a split second and then the car recovered and accelerated on its own.

    So I think I know why the car is doing this, and I would like to advise others to be aware of the situation. I believe the car thinks it's running into a wall when this braking occurs. What I believe is happening: when I am driving downhill on the interstate with an overpass in the middle of the hill, the car sees the dark background of the hillside; as it gets close to the overpass, the solid straight line of the overpass against the horizon makes it think there is a wall in front of the car, and it slams the brakes on.

    Has anybody else had this happen?

    Thank you

    I am glad to hear I am not the only one having this problem...kinda like beating a dead horse here...
  • edited

    Yes, there are at least 2 documented Autopilot fatalities already.

    If you are unable to handle Autopilot then you should not use it. You just need to wait for final production quality before using it.

    It's the same as babysitting: if you don't know how to look after a baby, don't invite one over; wait until the baby grows up into an adult, then you can invite that person over.

    To understand why phantom brakes still happen, please read:

    Phantom brakes and crashing into stationary objects will continue to happen until someone is smart enough to solve it!
  • edited November 2018
    Thanks for the link. After reading the above article, I would guess that after a few fatalities and some court cases we will have our Autopilot turned off by a judge. Bummer; I hope this does not happen. You know we will likely see a lawsuit filed against Tesla, and we will all get $5,000 back for purchasing Autopilot; I can see some lawyers playing that card. That would not be good for Tesla, a billion-dollar payout, and not good for my Tesla stock... I love Elon, he is an excellent visionary, but at times I think he overextends himself with his promises.
  • edited November 2018
    The only way autopilot gets yanked is if no one is allowed to drive at all, and we go back to the horse and buggy.

    Or walking.

    Autopilot isn’t perfect, but it’s better than the majority of human drivers. I don’t see that changing.
  • edited November 2018

    I don't think Autopilot is in any danger of being shut down.

    It's a tool just like a knife. If I hold a knife the wrong way, I might lose my fingers. A tool is perfectly safe as long as a user uses it as directed.

    The usefulness of a tool outweighs the deaths and injuries caused by its abuse.

    There have been a few lawsuits:

    1) Sudden acceleration: the class-action plaintiffs all dropped out except for a CA owner and his son (an individual case is much cheaper for Tesla than a class action).

    2) China street-sweeper Autopilot death: still pending because the father will not allow Tesla to access the car's logs, and Tesla has already provided decryption keys for the family to decipher the logs on their own.

    3) Late Autopilot release: a class-action lawsuit complained that the AP2 release was too slow. It has settled, and most owners will likely get about $200 from the settlement.

    4) Mountain View Apple engineer Autopilot death: the family expressed a desire to file a lawsuit. It still has not been filed 8 months later, despite preliminary NTSB findings (no driver input detected leading up to the fatal crash).

    With commercial airline autopilot, human pilots have always been blamed, even when there was clear evidence that the autopilot was the initial cause of a crash.

    Although the fatal Florida Autopilot investigation found that an Autopilot system without strict warnings/restrictions/geofencing contributed to the crash, it ultimately blamed the driver for overreliance on technology.
  • edited November 2018
    Tam, that is great information; it makes me feel better about the court cases. I guess if Tesla gives enough warnings we should be OK.

    I am leaving for Palm Springs on Tuesday, a 1,200-mile trip, and I will be using Autopilot for most of the drive.

    Thanks Again
  • This slamming on the brakes and generally acting like a thoroughbred racehorse (i.e. no gas, full gas, no gas) is driving me nuts! It didn't used to be this way; this is new behaviour from around the May timeframe. I don't know the exact SW version, but right now AP is just about unusable because I have no idea what it will do.

    Please fix this Tesla!!!
  • edited June 2019
    The Beta excuse is just that, an excuse. I always drive with my hands on the wheel and assume the car might over or under react. Generally, I am pleased with its operation and it does make trips more relaxed for me. With that said, the sudden hard braking for no apparent reason is a danger to anybody following me. I can’t predict it and I can’t slam on the throttle in time to rectify the initial slowdown. This particular issue must be solved quickly before a Tesla causes a rear end accident all by itself.
  • edited June 2019
    If someone is following too closely and AP brakes hard and they slam into your rear end... take a wild guess who is at fault? It isn't Tesla or AP.
  • edited
    I have used AP in my MX for over a year now on my regular 4-hour commutes to London for work. And remember, as others have already stated, it's still BETA.

    Yeah it has happened to me a few times, it’s never slammed on hard enough to bring me to a stop, but if it phantom brakes, I then just step on the accelerator and it’s fine.

    Yep it’s frustrating and I hope Tesla fix it soon but it’s still in the testing period, if you feel uncomfortable using it then don’t use it.

    Got to say, 99% of the time AP works amazingly well, you just have to still stay alert but it’s made my long highway/motorway commutes way more enjoyable.

    One thing I have noticed with the latest update that I'm not a fan of (UK): with AP engaged, the MX doesn't seem to want to undertake anymore. Obviously I know you shouldn't undertake, but if I am in the 3rd lane, for example, and a car starts to overtake me in the 4th lane but then has to slow down for some reason (e.g. the 4th lane is busy), my car will also slow down even though nothing is in my lane; hence me saying it won't undertake anymore.
    Has anyone in the UK experienced this?
  • edited September 2019
    When people realize that Autopilot is math and if/then statements in code, not intelligence, they'll stop having high expectations. AP does not know what 'things' are; it doesn't know that that is a truck; it doesn't 'know' that that is a shadow. It is a high-speed pattern matcher attached to servos.
    Now think about how much trust it takes to take your hands off the wheel...