Elon - implement an IQ test for autopilot privileges

If anything is proven by autopilot accidents, it's that just because someone buys a Tesla doesn't mean they're smart enough to use autopilot. Most of what I'm reading can be traced to user error, and unfortunately the general population isn't smart enough to be trusted with the technology yet. Hopefully Elon can continue to stay ahead of the morons who are creating bad examples of the wonderful technology he's created.

lilbean | May 26, 2018

I felt this way from the beginning. Thank you for mentioning it. :)

Tesla-David | May 26, 2018

Yes, I agree totally!

Mike83 | May 26, 2018

Each state has a driver's exam and requirements. Perhaps those owners with a bad driving record, like a DUI, etc., should be denied AP, but that might be too harsh. Yet most accidents are in ICE vehicles, and there are 102 deaths each day in ICE cars. Actually, the very few accidents in Teslas are anomalies, and even those involved in these accidents rarely get hurt.
It is the news media that amplifies one accident a thousand times; that is not based on reality but is clickbait for money.

jordanrichard | May 26, 2018

I forget who it was here that said this first, but a Tesla is a thinking person's car. Some people are just not equipped...

SCCRENDO | May 26, 2018

Perhaps climate change deniers don’t have the intellect to use EAP???

SCCRENDO | May 26, 2018

And perhaps Trump voters??? But they may be the same as climate change deniers.

carlk | May 26, 2018

Or just let natural selection do the work?

Sweetride | May 27, 2018

I love Tesla, and we have both S and 3 models. But I have to disagree with the sentiment in this post.

When Google began experimenting with autonomous vehicles, they limited testing to Google employees. They gave them strict instructions about what to do and not do when driving the test vehicles. But Google found that despite training and education, the drivers did things they were not supposed to do, and the result was accidents. Despite warnings, the drivers put too much faith in the driving assists and let their guard down as they became more comfortable even though they were told to be always on their guard and alert to unexpected things occurring. So Google took away the steering wheel, so that the drivers had no chance to go beyond the limitations.

I don't think there is any evidence that the drivers involved in Tesla autopilot crashes were not intelligent. I do think often they had become comfortable with the amazing capabilities of autopilot and as a result did not pay attention 100 percent of the time and allowed their attentions to drift as generally autopilot handles things magnificently.

Hopefully these well publicized cases will make people more alert. But I don't think the drivers were stupid.

Yodrak. | May 27, 2018

"I don't think there is any evidence that the drivers involved in Tesla autopilot crashes were not intelligent. I do think often they had become comfortable with the amazing capabilities of autopilot and as a result did not pay attention 100 percent of the time and allowed their attentions to drift as generally autopilot handles things magnificently."

I'm inclined to generally agree. But then I'm puzzled by this:
"Despite warnings, the drivers put too much faith in the driving assists and let their guard down as they became more comfortable even though they were told to be always on their guard and alert to unexpected things occurring. So Google took away the steering wheel, so that the drivers had no chance to go beyond the limitations."
How can drivers not be beyond "the limitations" without a steering wheel? There must be more to it than that.

Sweetride | May 27, 2018

Basically without a steering wheel, drivers were powerless to affect the car's driving, whereas previous versions allowed human intervention.

GM is also limiting cars by geofencing them to known freeways where they know their system can handle it, and having cameras focus on the driver's eyes to detect that they are paying attention. I'm not sure this is any better than Tesla's requiring hands to be on the wheel, but I guess the common theme is an attempt by manufacturers to limit drivers' ability to use (misuse) the system.
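
Roughly, that gating idea looks something like the toy sketch below (the road names, function name, and whitelist are made up for illustration; this is not GM's or Tesla's actual logic):

# Only allow the assist to engage when the car is on an approved
# (geofenced) highway AND the driver-monitoring camera says the
# driver is watching the road.
APPROVED_HIGHWAYS = {"I-5", "I-80", "US-101"}  # hypothetical whitelist

def assist_allowed(current_road: str, driver_attentive: bool) -> bool:
    return current_road in APPROVED_HIGHWAYS and driver_attentive

print(assist_allowed("I-80", True))     # True: mapped freeway, eyes on road
print(assist_allowed("Main St", True))  # False: outside the geofence
print(assist_allowed("I-80", False))    # False: driver not paying attention

Either way, the system refuses to run (or keep running) unless both conditions hold.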

I guess extremely reliable full self driving can't get here soon enough.

K-D-R | May 27, 2018

@Sweetride - I disagree. Despite the excitement, and as excited as I am to pick up our new MS, this technology has been around in various forms since we bought our Toyota Sequoia in 2008. Our kids' Hondas both have elements of it, as does our Porsche.

I'm the only one in the family who uses these features, and despite how well our 2008 Sequoia worked, there are times I purposely don't trust the cars and don't use them.

Call it lack of intelligence or inexperience, but if people were truly paying attention and learning their cars, as opposed to delegating judgment to a computer, I have a feeling many if not all of these accidents would have been avoided.

Sweetride | May 27, 2018

Humans by nature do not always follow best practices, unfortunately. Doesn't make people stupid. Just human.

carlk | May 27, 2018

Animals follow instincts; humans do much better. That's the only reason why we are here doing what we do today. Anyone who wants to go back to living the cave life, be my guest.

Mike83 | May 27, 2018

They should put eye tracking in non-EAP cars first if that's what invasive pundits wish. There are 3,500 accidents each day in ICE vehicles, as well as 500 fires each day. The few accidents in EAP cars are not due to EAP, which prevents accidents.

K-D-R | May 27, 2018

Not sure I can articulate this, but I wonder how many accidents are happening with automated driving cars as a result of the driver overreacting too late, hitting the brakes and removing the car's ability to take over, and not having enough time themselves to take corrective action.

Mike83 | May 27, 2018

If there were even one tiny accident with a Tesla nearby, it would make the news, so in reality the problem is very small.

wisam.alrawi | May 28, 2018

Really hard to implement practically. In my opinion, the best solution is for Tesla to release the enhanced version of Autopilot that Elon was already talking about. I remember from Tesla news that it's still being beta tested, but a release that greatly improves it could reduce the probability of accidents like the ones that happened before. Really hard to protect against ignorant folks.

I'd also say that Tesla should do everything possible to prevent a front collision. If you have sonar and you detect an object like a fire truck, Autopilot should take action to prevent a front collision regardless of computer vision. Just saying.
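
Roughly, I'm imagining an "any sensor wins" braking rule like the toy sketch below (all the names and the distance threshold are made up for illustration; this is obviously not Tesla's actual code):

from dataclasses import dataclass

@dataclass
class SensorReading:
    obstacle_ahead: bool   # did this sensor report a forward obstacle?
    distance_m: float      # estimated distance to the obstacle in meters

def should_emergency_brake(radar: SensorReading,
                           sonar: SensorReading,
                           vision: SensorReading,
                           threshold_m: float = 30.0) -> bool:
    # Brake if ANY sensor sees a close forward obstacle, instead of
    # requiring the vision system to agree first.
    for reading in (radar, sonar, vision):
        if reading.obstacle_ahead and reading.distance_m < threshold_m:
            return True
    return False

# Example: radar reports a stopped truck at 25 m, vision reports nothing.
print(should_emergency_brake(
    radar=SensorReading(True, 25.0),
    sonar=SensorReading(False, 999.0),
    vision=SensorReading(False, 999.0)))  # -> True

The point is just that a single confident sensor should be enough to trigger braking, rather than waiting for vision to confirm.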

michael | May 28, 2018

The reality is that humans can't be trusted. Period. So for Tesla to put the onus on the driver to "babysit" EAP, even given the software warnings/acknowledgements/legalese, is not good enough. I think what will likely happen is that the government is going to bring the hammer down on Tesla and force them to disable EAP (even as a beta feature) except for the traffic-aware cruise control functionality (i.e. no active steering). The feature will simply be unavailable until Tesla can get regulatory approval for Level 4/5 autonomy. Now, obviously, TACC by itself is not worth a $5K up-charge, so Tesla will also have to throw in the TACC feature for free to be competitive with every other modern car out there (most of which are far cheaper to boot). I also think the "Autopilot" branding will be dropped (until Level 4/5 autonomy, that is).

lilbean | May 28, 2018

Maybe the IQ test isn't such a good idea. People with a high IQ could be really dumb behind the wheel. Then when the high IQ person crashes, they will blame Tesla and claim that they passed the test so it's Tesla's fault.

nadurse | May 29, 2018

I'd like to know, of the people who complain about autopilot being dangerous/mislabeled/misrepresented, whether they've actually used autopilot... because for me, my opinion changed drastically having used it first hand, and I don't think you can logically discuss the system if you have no experience using it.

I think that Tesla has provided an acceptable amount of warnings, statements, etc., such that any reasonable person who uses it understands what the system is and isn't. Frankly, if people use it beyond its capabilities, I think that they have to accept the responsibility, not Tesla. To me it's readily apparent that I still have overall control of the vehicle but am simply delegating the mundane parts of driving to the autopilot system: staying in the lane, keeping distance to the car in front of me, keeping speed, and changing lanes only when I tell it to. Any time there are odd obstacles or on-ramps and off-ramps, I take back full control.

However, if I expect it to do something well within its limitations and it doesn't, then I think the responsibility falls to Tesla, because you cannot reasonably conclude that people won't fall into a sense of security. An example of this would be if the highway curves and the car keeps going straight over the line, or if it's following a car and doesn't adjust its speed to keep the set distance; those are the two primary functions of the system, and if they don't work, then something is wrong with the system. From the incidents I have read about, this was not the issue; it was that people were going beyond the capabilities of the system or being completely negligent and ignoring warnings.

Maybe instead of an IQ test they can implement a Common Sense test.

SamO | May 29, 2018

I have called people idiots for using the system in situations that are NOT covered (local streets, merging lanes) or for doing things while autopilot was on (texting, sleeping, watching movies) but, in fairness, regular drivers do this stupid stuff all the time and end up crashing their cars.

This is just normal stupid . . . i.e., you can be a regular smart person AND be overconfident in both your ability and the ability of the car to keep you safe.

Nothing new under the sun.

LUMadDog | May 29, 2018

This may have been posted before, but here is a relevant survey in the autopilot area:

https://hcai.mit.edu/tesla-survey/

michael | May 29, 2018

Hence my statement that humans can't be trusted, period. And because people can't be guaranteed to be trusted, it's only a matter of time before regulators force Tesla to disable the feature until Level 4/5 autonomy is green-lit and unreliable humans are cut out of the picture altogether.

Mike83 | May 29, 2018

The basic evidence that EAP has saved many besides my wife and me from accidents, versus people driving without it, seals in cement that it is much safer. And if humans can't be trusted, then all vehicle driving must be stopped. Quite a crazy argument.

Sweetride | May 29, 2018

@mike83, curious how EAP saved your life... what did the system do to save you that driving diligence alone would not have been sufficient for? I say that because the baseline for using EAP is driving diligence, so what happened that EAP handled better than your diligent driving skills alone? Thanks.

Mike83 | May 29, 2018

Several times on the freeway, while in the second-to-slowest lane, our car slowed down when a car we couldn't see stopped in the slow lane; the radar did see it and prevented us from getting hit from the side. There are other times the car slows down to prevent a collision. We use it 98% of the time, and in a year and a half we have found it lowers our stress level. It is like having a second driver who warns you and also does the steering, braking and accelerating. It is amazing.
Our insurance rates even dropped 45% with two Teslas. They know, while the ICE liars try new FUD. Dumb.

Sweetride | May 29, 2018

I’m glad you love EAP. I don’t understand your example though. I thought EAP currently does a poor job reacting to activity in adjacent lanes.

Mike83 | May 29, 2018

Don't believe what you read in the news or on forums. Drive the car and find out. The short gamblers put out so many lies that one must review the source and follow the money. Pretty simple. Our car sees vehicles all around it and reacts, and it keeps getting better. I love Summon and self-parking also.

michael | May 29, 2018

Traffic-Aware Cruise Control (TACC) and emergency braking are standard features on most cars these days - the active steering part of EAP is what's causing most of the issues.

Mike83 | May 29, 2018

Not driving a Tesla with EAP makes your points fantasy as they are so wrong.

Remnant | May 29, 2018

@michael (May 28, 2018)

<< The reality is that humans can't be trusted. Period. So for Tesla to put the onus on the driver to "babysit" EAP, even given the software warnings/acknowledgements/legalese, is not good enough. >>

I concur, but the driver must be accounted for by the EAP programmers as the multifaceted, complex entity he or she actually is, rather than as the clown left holding the bag for anything that goes wrong.

SO | May 29, 2018

If these Tesla owners who misuse AP cared at all about the company, they would be sure to use it the right way. AP is still relatively new and could easily be derailed by public outcry due to these articles.

Remnant | May 31, 2018

@SO (May 29, 2018)

<< If these Tesla owners who misuse AP cared at all about the company, they would be sure to use it the right way. AP is still relatively new and could easily be derailed by public outcry due to these articles. >>

As Michael said two days ago, the EAP Tesla owner has no EAP babysitting obligations towards Tesla. It's the latter who promised Level 4/5 hardware, to be followed within a reasonable amount of time by adequately designed SW. And that cannot be just IQ, but a much more complex Driver, with many features, functions and parameters.

SCCRENDO | May 31, 2018

@Remnant. When we adopt new technologies, we need to learn how to use them. EAP is an extremely useful tool. But it still requires a driver to pay attention. Your IQ does not have to be any higher than that of anyone else driving a car. But if you do not have the capacity or willingness to learn how to do either, there is a higher risk of a problem. Indeed, when used as an adjunct to safe driving, it provides additional safety and a much more relaxing drive.

Rumi11 | May 31, 2018

At this point, I have to say I agree. It really is just getting ridiculous. :-/

Remnant | June 2, 2018

@SCCRENDO (May 31, 2018)

<< When we adopt new technologies, we need to learn how to use them. EAP is an extremely useful tool. But it still requires a driver to pay attention. >>

You are conflating two distinct issues:

(1) The driver should learn to use the new EAP tech.

OK. No objection. I have long advocated for a Learning Mode as a transition to the Full Use Driving Mode, or Owner's Mode. (The Learning Mode could be the Valet Mode.)

(2) For the final Level 4/5 EAP (or FSD), the EAP programmers should include enough Driver information that they do not have to resort to a "hands-on-the-wheel" requirement for the actual, live Driver in order to address the extreme hazards currently handled in that manner.