Interesting article. Not really concerning considering what they had to do to fool the Autopilot.
Not to mention that it's a criminal act, too.
This is why Autopilot should not be used in a construction zone, or in other areas where road markings are not well done.
@Yodrak: That's why I always thought that when the front-facing camera sees the big orange construction sign, AP should turn off automatically and the driver should take control (with audio and visual alerts, too). Also, the orange construction sign posted about one mile before the beginning of a construction zone could have a chip in it that tells the Tesla mothership to disable AP/FSD on a Tesla vehicle as it approaches the hazard area, otherwise known as a construction zone. At a minimum, reduce the vehicle's speed to, say, 45 MPH so the driver can adjust to unforeseen hazards. Some Teslas might not have hit the Jersey barrier head-on, or careened off into another car's path.
So we keep talking about FSD and how each vehicle can react to existing conditions unilaterally. When will we also incorporate vehicle-to-vehicle (V2V) communications to let vehicles in close proximity interact with each other? This was all the rage about 10 years ago, before autonomous vehicles came into vogue. Every vehicle (V1) needs to relay its intended path forward to the others (V2, V3, et al.) before it can make intelligent decisions to avoid collisions, sharing each vehicle's future route with all concerned vehicles. Kind of like what we now do between human drivers with eye contact and turn signals. That's another of my concerns: why, why don't Mercedes and BMWs have turn signals? Such high-end autos, and the owners don't buy that option?
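The "share your intended path" idea above can be sketched in a few lines. This is a hypothetical illustration of a V2V intent exchange, assuming a simple broadcast model; the message fields and the conflict check are invented for this sketch and are not part of any real V2V standard (DSRC, C-V2X, etc.).

```python
from dataclasses import dataclass

@dataclass
class IntentMessage:
    """Hypothetical broadcast: a vehicle's ID and its planned path."""
    vehicle_id: str
    # Planned path as a short list of (x, y) waypoints in meters.
    path: list

def paths_conflict(a: IntentMessage, b: IntentMessage, min_gap: float = 2.0) -> bool:
    """Return True if any pair of planned waypoints comes closer than min_gap."""
    for (ax, ay) in a.path:
        for (bx, by) in b.path:
            if ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 < min_gap:
                return True
    return False

v1 = IntentMessage("V1", [(0, 0), (10, 0), (20, 0)])
v2 = IntentMessage("V2", [(20, 5), (20, 2), (20, 0)])  # merging into V1's lane
print(paths_conflict(v1, v2))  # True: both plan to occupy (20, 0)
```

In a real system each vehicle would broadcast something like this periodically, and receivers would replan whenever a conflict is detected; the sketch only shows the conflict test itself.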
One day, instead of (or in addition to) orange cones, any construction activity should be posted online and be accessible to navigation map services and vehicles on the road. That should be a pretty easy and straightforward thing to do.
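A minimal sketch of that idea: construction zones published as machine-readable data that a navigation system checks a planned route against. The feed format (bounding boxes) and the containment check are assumptions for illustration only, not any real service's API.

```python
# Each published zone: (min_lat, min_lon, max_lat, max_lon) bounding box.
# These coordinates are made up for the example.
CONSTRUCTION_ZONES = [
    (42.35, -71.60, 42.36, -71.55),
]

def route_hits_zone(route_points):
    """Return True if any point on the planned route falls inside a published zone."""
    for lat, lon in route_points:
        for (lat0, lon0, lat1, lon1) in CONSTRUCTION_ZONES:
            if lat0 <= lat <= lat1 and lon0 <= lon <= lon1:
                return True
    return False

route = [(42.34, -71.70), (42.355, -71.57), (42.37, -71.40)]
print(route_hits_zone(route))  # True: the second point is inside the zone
```

A car could use a check like this to slow down or hand control back to the driver well before the cones appear.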
A few weeks ago, I was driving on the Mass Pike and the car displayed an error message that said something like "Construction detected, disabling Navigate on autopilot" in a construction zone that has been there for a year.
I have not seen the error since, and the car is quite capable of making it through the area without issue (even though I always keep a grip on the wheel there).
So perhaps it has the capability of seeing construction areas, but doesn't do it reliably.
The other thing I have seen is that older lines from when the lanes used to shift are sometimes still partially visible; the car will sometimes attempt to follow them, but then it sees the other lane markings and jerks back into the correct lane.
But the whole article deserves a "well duh!" rating in my book. Yes, the car will follow lines. The issue was fixed in a later software release, and the auto wipers coming on is certainly not a 'show stopper' in anyone's book.
Human drivers use a hierarchy of rules when they drive, combined with local knowledge of the road. This is why a supercomputer is needed for FSD. It's something like: Rule 1, don't hit anything hard. Rule 2, stay on the road. Rule 3, stay in your lane. Etc. In the hacking case described, the car didn't have the 'brain power' to recognize the 'real' lane. As the supercomputer's neural net gains experience, this type of issue will go away.
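The rule hierarchy described above is essentially prioritized decision-making: a lower rule only matters once every higher rule is satisfied. Here is a minimal sketch of that structure; the rules, sensor flags, and action names are all hypothetical, not Tesla's actual logic.

```python
def choose_action(obstacle_ahead: bool, on_road: bool, in_lane: bool) -> str:
    """Evaluate driving rules in strict priority order and return the first action that fires."""
    # Rule 1: don't hit anything hard (highest priority).
    if obstacle_ahead:
        return "brake"
    # Rule 2: stay on the road.
    if not on_road:
        return "steer back onto road"
    # Rule 3: stay in your lane (lowest priority shown).
    if not in_lane:
        return "correct lane position"
    return "continue"

print(choose_action(obstacle_ahead=False, on_road=True, in_lane=False))
# "correct lane position": the lane rule only applies because the
# higher-priority rules are already satisfied
```

The point of the hierarchy is visible in the hacked-lane case: a car that could rank "that marking leads off the road" above "follow the marking" would ignore the fake lane.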
"the whole article deserves a "well duh!" "
"Human drivers use a hierarchy of rules when they drive ..."
But not all use the same hierarchy.
I remember this issue appearing a long time ago, back when the Coyote repainted lane markings into a boulder that had a black tunnel entrance painted on it. The Roadrunner did follow the lane markings, and actually disappeared INTO the tunnel. When the Coyote tried to follow, he hit the boulder with the tunnel entrance painted on it.
So maybe name your car ‘Roadrunner’ so this isn’t a problem for you?
@reed "A few weeks ago, I was driving on the Mass Pike and the car displayed an error message that said something like "Construction detected, disabling Navigate on autopilot" in a construction zone that has been there for a year."
I've seen that message, and one warning about bad weather ahead, too. But I've seen neither for a while (maybe since the first release of NoA?).
I get the bad-weather "Navigate on Autopilot disabled" message whenever it is raining heavily. That happened this morning here in Mass because it was raining quite a lot.
It probably has to do with the radar not being able to see as well in the rain, or perhaps the side cameras having too much interference.
Sure, it is a contrived attack on Autopilot, but not an uncommon scenario given the poor markings on many roads. Humans are still far better than computers at looking at images and figuring out where the lane really is.
I think we will see more and more of these "attacks". This is probably just the beginning. It will be a cat and mouse game for quite a while.
You understand it is just a lab simulation, don't you? If someone did this in the real world, it would be vandalism, or manslaughter/murder if it caused an accident and a fatality. No different from throwing rocks at cars from an overpass.
And more importantly, from the article....
"The flaws are in version 2018.6.1 of the car's firmware and have been fixed in a subsequent release of the software (2018.24)."
So the issue has been fixed anyway. It is a NON-ISSUE! That is the best thing about Tesla: if there is a software issue, it will be fixed. Remember the braking issue with the Model 3? Tesla fixed that with a software update. Try getting that done with pretty much any other car!