
17.18.25 for US AP1, anyone yet?

I see a new version for AP1 is rolling out. Does anyone have it, and is there anything interesting in it?

Millionmilesorbust | 11/05/2017

I just downloaded and installed it. I'll go check the car soon and post the update notes.

Millionmilesorbust | 11/05/2017

"with this release, we're continuing to improve your model S experience with software fixes and improvements"

JAD | 11/05/2017

Ok, thanks.

georgehawley.fl.us | 11/05/2017

I'm skeptical about Tesla improving driver-assistance software that was written by Mobileye. AP1 TACC and Autosteer could benefit from some improvements, but they probably work well enough to be useful for alert drivers without being very dangerous. Tesla's software engineers are probably focusing on AP2 Tesla Vision features rather than reverse engineering Mobileye code. But if Tesla actually does that reverse engineering and improves AP1 features, I will be delighted.

Haggy | 12/05/2017

Mobileye might tell the car things such as which road signs it sees and whether it sees lane markings, but ultimately it's Tesla's software that pulls everything together, including radar and sonar, and has the car act on that information. If it were merely Mobileye, we wouldn't have seen much change since AP1 first appeared, at which point the hardware did almost nothing. It did read speed limit signs, and the car made use of proximity sensors (i.e., parking sensors), but the latter likely had nothing to do with the Mobileye chip, which is physically in the camera housing.

Tesla now has to interpret what comes out of the camera itself before deciding what to do with the information. It's theoretically possible for the AP1 car to be limited to only the information that the camera's associated chip exposes, while the newer cars could be programmed to recognize something specific that a camera can see but that the EyeQ3 chip doesn't report.
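
To make that split concrete, here is a minimal, purely illustrative Python sketch of the data flow described above, assuming a hypothetical EyeQ3-style report object and Tesla-side fusion logic. The class and function names are made up for illustration and are not real Mobileye or Tesla interfaces.

# Illustrative sketch of the AP1 split described above: the EyeQ3 chip exposes
# a fixed set of detections, and Tesla's own software fuses them with radar and
# sonar before deciding what the car does. All names here are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class EyeQ3Report:
    """Only what the camera's chip chooses to report is visible downstream."""
    lane_markings_seen: bool
    speed_limit_mph: Optional[int]
    lead_vehicle_distance_m: Optional[float]


@dataclass
class RadarReading:
    lead_vehicle_distance_m: Optional[float]


def plan_target_speed(vision: EyeQ3Report,
                      radar: RadarReading,
                      sonar_clear: bool,
                      set_speed_mph: int) -> int:
    """Toy TACC-style fusion living on the Tesla side of the boundary.

    This layer never sees raw camera frames, only the EyeQ3Report, so any
    improvement here is Tesla software work rather than chip work.
    """
    target = set_speed_mph
    if vision.speed_limit_mph is not None:
        target = min(target, vision.speed_limit_mph)

    # Back off if either the camera chip or the radar reports a close lead
    # car, or if sonar says the immediate surroundings aren't clear.
    distances = [d for d in (vision.lead_vehicle_distance_m,
                             radar.lead_vehicle_distance_m) if d is not None]
    if (distances and min(distances) < 30.0) or not sonar_clear:
        target = min(target, 25)
    return target

The point of the sketch is only where the boundary sits: the fusion and control logic is Tesla's code, while the contents of the report are fixed by what the chip exposes.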

I don't think that will be enough to keep the car from evolving, but the car has limitations due to the overall hardware anyway. It might be that some things the chip can do, such as reading stop signs and traffic lights, aren't good enough for Tesla to incorporate, even though it might have been theoretically possible with Tesla's own processing of the camera output. But it's more likely that Tesla wanted to use multiple cameras as a solution, not better processing of the existing camera.