How is Tesla learning when we're not supposed to use Autopilot on streets?

I watched today's presentation on the current state of FSD and I was really impressed. As I understand it, Tesla is learning from millions of miles driven with Autopilot.

Since we're not supposed to use Autopilot in non-highway situations, how is Tesla gaining data for non-highway driving? Is my assumption about learning from Autopilot wrong? Do they collect data whether it's engaged or not?

BTW, I'm not going to add FSD to my 3, but I am going to add it to my Y order.

Thanks,

bp | 23 April 2019

With all AP2+ S/3/X cars, Tesla runs the AP software in "shadow mode". Even if AP/NOAP isn't engaged, the AP software is still running in the background, comparing what it would do vs. what the driver actually did. That data is sent back to the AP software group, which is evidently using it to test out new AP capabilities and train them to drive at least as safely as a human.
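
In rough Python pseudocode, the idea is something like this (my own sketch of the concept; all names and thresholds are made up, not Tesla's actual code):

    # Sketch of the shadow-mode idea: run the planner silently and log
    # frames where the human driver did something the planner wouldn't.
    from dataclasses import dataclass

    @dataclass
    class Action:
        steering: float  # steering angle, radians
        accel: float     # acceleration, m/s^2

    def disagrees(planned: Action, actual: Action,
                  steer_tol: float = 0.05, accel_tol: float = 0.5) -> bool:
        """True when the human's action differs meaningfully from the plan."""
        return (abs(planned.steering - actual.steering) > steer_tol
                or abs(planned.accel - actual.accel) > accel_tol)

    def shadow_mode_step(sensor_frame, driver_action, planner, uploader):
        planned = planner.plan(sensor_frame)   # what AP *would* have done
        if disagrees(planned, driver_action):  # human chose differently
            # queue the snapshot for upload back to the AP software group
            uploader.queue(sensor_frame, planned, driver_action)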

This is happening even in the vehicles that don't have AP/EAP/FSD activated (which is probably why Tesla includes the hardware in every vehicle).

Tesla claims their fleet learning capability should allow them to more quickly get FSD working - not only on highways but on urban streets.

wiboater4 | 26 April 2019

And then there are those of us who understand that Tesla has to CYA, but who still use it while watching over it vigilantly in an effort to help get FSD here sooner. I think of it as an extra set of eyes in case I glance away for a second to look at the songs on the screen.

RJMIII | 26 April 2019

@wiboater4 - Actually, after watching the Autonomy Day presentations, I think that not using AP on side streets may be just as beneficial, if not more so, to FSD development (although I'm sure Tesla appreciates and uses both).

The reason I say this is that we now know the software is running in shadow mode at all times, and when the human driver does something different than the software would have, that's a learning opportunity. I think (my opinion) that imitation learning will be most valuable in off-highway driving conditions. The interaction of cars, people, animals, bikes, etc. is much more complex and fluid in these situations. So, imitating actual human behavior, even when humans don't exactly follow the lanes, signs, or rules, will be more useful to the system.
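
For anyone curious what "imitation learning" means mechanically, here's a minimal behavioral-cloning sketch in Python/PyTorch (the network, sizes, and data are stand-ins I made up, not anything Tesla has published):

    # Behavioral cloning: train a policy network to reproduce the human's
    # control outputs from scene features. Purely illustrative.
    import torch
    import torch.nn as nn

    policy = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 2))
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)
    loss_fn = nn.MSELoss()

    def train_step(features: torch.Tensor, human_action: torch.Tensor) -> float:
        """One gradient step toward imitating what the human driver did."""
        predicted = policy(features)             # what the net would have done
        loss = loss_fn(predicted, human_action)  # penalize disagreement
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    # e.g. features: batch of 64-dim scene embeddings,
    # human_action: batch of [steering, accel] pairs
    # train_step(torch.randn(32, 64), torch.randn(32, 2))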

blue adept | 26 April 2019

Or... they could simply scan some street maps into their database to draw from; after all, there's no need to "learn" what is already known.

TeslaTap.com | 26 April 2019

@blue - It's not the map Tesla is learning, it's the edge cases and how humans handle them. Objects flying off cars, a paper bag blowing across the roadway, ducks waddling across the road... The list of strange cases that FSD needs to handle properly is almost endless. Capturing those cases is really helpful.
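
A toy illustration of how a capture trigger for those cases might look (entirely my own guess at the shape of it, not how Tesla actually does it):

    # Toy "edge case trigger": flag a frame for upload when it contains
    # object classes the system rarely sees, or low-confidence detections.
    RARE_CLASSES = {"animal", "debris", "flying_object", "unknown"}

    def should_capture(detections, confidence_floor=0.5):
        """detections: list of dicts with 'label' and 'confidence' keys."""
        for d in detections:
            if d["label"] in RARE_CLASSES or d["confidence"] < confidence_floor:
                return True
        return False

    # A waddling duck might come through as something like:
    # should_capture([{"label": "animal", "confidence": 0.92}])  -> True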

jdeskins | 26 April 2019

It's basically learning to drive by logging our reactions in different scenarios. When we drive, we don't have the map memorized and drive from that - we drive from what we see. I drive through construction zones almost daily in a temporary lane that's not on any map.

Each of us may have some influence, through our daily driving habits, on how FSD handles different situations. I think I heard in the Autonomy Day presentation that they do filter out bad drivers. :)
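
If I had to guess at that filtering logic, it'd be something like this (thresholds invented purely for illustration):

    # Guess at "filter out bad drivers": only learn from drivers whose
    # recorded behavior looks safe. All thresholds are made up.
    def is_good_exemplar(hard_brakes_per_100mi, avg_following_secs,
                         collisions_per_100k_mi):
        return (hard_brakes_per_100mi < 2.0      # rarely slams the brakes
                and avg_following_secs > 1.5     # keeps a safe following gap
                and collisions_per_100k_mi < 0.5)

    # Only miles from drivers passing the filter would feed the
    # imitation-learning pipeline; the rest could be dropped or used
    # as negative examples.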

NKYTA | 26 April 2019

I hope “don’t hit geese, but aim for crows” is a future option. ;-) ;-)

blue adept | 27 April 2019

@TeslaTap.com

I understand what they're attempting to model (the dynamism of intrinsically unpredictable human behavior). I was just speaking to the simplicity of the underlying metrics, which has been my point all throughout these conversations on the matter: mapping roads is one thing, while 'mapping' human behavior is an altogether other thing, nigh on impossible if not entirely unfeasible.

That's why I believe that, considering all of the associated complexities, fully autonomous driving on our surface and/or neighborhood streets would only be possible in a controlled environment, say, for example, the one depicted in the movie "I, Robot" (which was actually a subterranean network of highways, as opposed to the elevated ones we're used to), because it would effectively circumvent the incalculable randomness of the human factor.

So, to @Robocheme's observation: it is my position that FSD capability should be reserved solely for highway travel. Surface street incidents could very likely prove ruinous for a company that makes such a feature readily available for neighborhood street navigation to those too damn lazy, or too consumed with what might be on their phones, to pay attention to the road ahead of them and, possibly, to someone's child running out in front of them after a ball or something.

TeslaTap.com | 28 April 2019

@blue - I think some equate FSD with having to be perfect, and that is a tall order, one I don't ever expect to occur. It only has to be better than humans. How much better is an interesting debate. Yes, there will be kids who run out between two cars and will be killed by human drivers and FSD cars alike. Not much can be done about that.

With a kid running out, given enough advance visual warning, FSD may avoid harming that child every time, whereas a percentage of human drivers will either not react in time or not see the issue in time. In these cases FSD will save lives. I expect there will be many other cases too, and perhaps some cases where FSD performs similarly to a human driver. The key is in percentages. If FSD is 10 times better than human drivers overall, isn't that good enough?
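
To put rough numbers on "the key is in percentages": the US sees roughly 35,000 road deaths a year, so a quick back-of-the-envelope calculation:

    # Back-of-the-envelope: scale annual US road deaths by a hypothetical
    # 10x safety improvement. Illustrative arithmetic only.
    human_deaths_per_year = 35_000
    safety_multiplier = 10  # "10 times better than human drivers overall"
    fsd_deaths_per_year = human_deaths_per_year / safety_multiplier
    print(fsd_deaths_per_year)  # 3500.0 -- far fewer, but not zero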

Waymo has been testing in our local city streets for years now. There have been a few minor accidents, but I think all were caused by other drivers. Clearly FSD can be made to work. Tesla is using a different approach than Waymo, and until we see Tesla's solution, we don't know if it will be as good or better than Waymo.

purepwnage5000 | 29 April 2019

I'm curious to see how FSD handles things like 4-way stop signs, and your average idiot driver trying to merge in rush-hour traffic, once it is approved and released.

Yodrak. | 29 April 2019

"I think some equate FSD as having to be perfect, and that is a tall order, one I don't ever expect to occur."

I agree on both points. With regard to the first point, the issue is that the term is misleading and some people rightly insist that a half-full glass, or even a 90% full glass, is not a full glass.

"It only has to be better than humans. As to how much better is an interesting debate."

Yes, a very interesting debate. As we've seen with people killing themselves when misusing Autopilot, and with a few battery fires getting more attention than much more common gasoline fires, better is not always good enough for many people.

TeslaTap.com | 29 April 2019

A bit macabre, but thinking a bit more on the kid-that-runs-out scenario. Let's say it occurs so suddenly that no human or FSD could avoid hitting the poor kid. Now the human would likely be up on charges of involuntary manslaughter, and it may be difficult or impossible to prove it was not avoidable. FSD on the other hand has all the data and video to prove there was no other action possible in the given amount of time and it becomes just a sad accident.

Yodrak. | 29 April 2019

"Now the human would likely be up on charges of involuntary manslaughter, and it may be difficult or impossible to prove it was not avoidable."

Agree

"FSD on the other hand has all the data and video to prove there was no other action possible in the given amount of time and it becomes just a sad accident."

Maybe Mr. Spock would agree, but a good lawyer and some 'expert' witnesses can make a case in a civil suit that the FSD failed and that perhaps a human driver might somehow have been able to avoid the accident. Even if the jury or a judge don't buy it, certain segments of the press and certain segments of the population will.

sschaem | 29 April 2019

It's evident that self-driving vehicles will be held to a much higher standard in an accident event.

People have accepted that human drivers can and will kill others through bad driving.
In the US alone, over 2 million people a year are injured by bad driving, many very seriously, and over 35,000 die.

We don't realize it, but we have a false sense of security because we believe the other drivers are like us...

Tesla FSD will most likely never reach a rally driver's skill, but it's already better than a decent portion of the population on the highway.

The test that would be interesting... can a Tesla with FSD pass the California road driving test? :)
(From what I see, the beta version would ace it already.)

nothotpocket | 30 April 2019

They should be able to handle everything - as long as the system has seen enough of it before. I was glad to hear that Tesla is as concerned with those 'edge' cases as they should be. If the risk hasn't been logged, the system can't handle it. A ball that has rolled across the street is a flag for an alert human driver that a kid might follow it. Will this data get captured by the fleet? That is, does it happen regularly enough, and when it does happen, are we responding appropriately so the system learns? Can the system generalize from an alert Tesla driver responding to a soccer ball and use the same logic when a remote-control car does the same thing? It's all about the data, it seems.

blue adept | 2 May 2019

@TeslaTap.com

I'm not saying that Tesla shouldn't use it or that it wouldn't be beneficial, I'm just saying that they should proceed cautiously and not allow themselves to be coerced or conned into the early deployment of the feature by other automakers/drivers until it has received extensive real world testing by appropriately trained and alert personnel so they (Tesla) don't risk ruining themselves, so-called "re-insurers" or not.

Tesla catches enough flack as it is, and while the majority of it might be unwarranted, made-up, speculative BS, the point is that the unrelenting negative publicity serves to keep Tesla in the spotlight of public scrutiny, and people love a good crucifixion.

We've already got people running themselves into tractor trailers and concrete lane dividers from apparent misuse of, or over-reliance on, the "Autopilot" feature as it is. One can only imagine the extent of the carnage that FSD would entail in the hands of today's all-too-often easily distracted drivers, or just those looking to drive a nail into Tesla's coffin.

blue adept | 2 May 2019

@nothotpocket

>>> "They should be able to handle everything - as long as the system has seen enough of it before."

It would necessarily have to be a bit more involved than that. Specifically, I do not think it possible to squeeze enough sensors or computing power into a package streamlined enough to fit deftly within the existing framework of the Tesla chassis, let alone to compile enough data, to enable a Tesla to safely drive itself around our streets.

Granted, Tesla has moved chip development and/or manufacture 'in-house', preferring to forge their own inroads into the world of processor manufacture (wisely I might add given the current cyber landscape...someone is clearly listening, thankfully), as opposed to having to rely on someone else to do it for them:

https://www.wired.com/story/musk-says-tesla-is-building-its-own-chip-for...
https://www.wired.com/story/teslas-new-chip-holds-key-full-self-driving/

All the same, I'm unaware of the bench-tested specifics of the chip's design or, more importantly, its functionality and computational power, other than what has been mentioned. But it would still lack the amount and sort of real-world telemetry required for a vehicle to safely navigate the dynamism of the ever-changing urban, suburban, and even rural landscapes of our commuter world, a feat that mere compilation of statistical scenarios alone isn't suited for.

Rather, there would need to be sensors (complete with built-in redundancies) placed along the breadth and length of all commuter roadways, in predetermined, measured increments befitting each sensor's range, with the agility to both monitor and transmit, in real time, the ever-changing environments of our city streets and pedestrian walkways. That goes back to what I've been saying all along about the need for a large infrastructural investment to make FSD a feasible reality with marginal risk, befitting the most aware and reactive drivers. Because no matter who you are, or however great an innovation something might be, its benefit pales in comparison to the loss of someone's child and/or loved one, especially a loss that could be perceived as preventable but for a 'computational error'.

Just saying.

blue adept | 2 May 2019

@sschaem

>>> "Its evident that self driving vehicle will be put at a much higher standard in an accident event."

Exactly!

Imagine the stigma of 'calculated indifference' that would arise, once the 'human factor' were removed, should an accident occur that resulted in the death of someone's child.