Why Tesla Is Betting On Cameras For Full Self-Driving

I almost never look at Seeking Alpha articles as they are usually garbage created by Tesla shorters. This one caught my eye and is overall positive and brings up some excellent points, especially about LIDAR. One of the more reasoned articles I've come across.

https://seekingalpha.com/article/4106093-tesla-betting-cameras-full-self...

Madatgascar | 12 September 2017

Why I am betting against cameras for full self-driving:
1. They will never know what to look for the way a human brain does.
2. They won't stay clean - human eyes blink constantly to stay clean.
3. They can't make eye contact with other drivers or pedestrians, or exchange hand signals.
4. I've been using AP for two years now, limitations of the forward cameras are really obvious.

I just haven't figured out how I can make money betting against this. It's so obvious to me that FSD is impossible in the near term. I don't want to bet against TSLA, because I know they will do incredible things. I think that by pushing for the impossible, Elon and his team will develop the best possible. People are inclined to appreciate innovation and (especially in the case of Tesla) forgive any shortcomings. But I would place a pretty big bet against FSD by 2019. Even 2029 would surprise me.

1jetskier7 | 12 September 2017

"There are several moving parts to the debate about LIDAR. The first and least controversial question is whether LIDAR is better than no LIDAR, all else being equal. Everyone agrees on this. If cheap, compact LIDAR existed today that didn't impact a car's appearance or add significant cost, my guess is that Tesla would include it in its cars. My guess also is that once cheap, compact LIDAR is available, Tesla will add it to its cars."

So currently, profit trumps safety?

According to the article, LiDAR is somewhat ineffective under the worst possible weather conditions. Vision will be better under those conditions? Then why don't my automatic wipers work? They are based on vision (this year only)... Please explain.

johnson.todd.r | 12 September 2017

I had to laugh at this part of the article: "To be fair, with different input — the high-resolution color vision of the human eyeball — humans might be able to spot the semi truck. But given the same input, computer vision outperforms humans."

So... what this is saying is humans are better than computer vision, unless they get the same impoverished input that computer vision gets. I take that to mean that humans ARE better.

Frank99 | 12 September 2017

Here's a different take:
https://cleantechnica.com/2016/07/29/tesla-google-disagree-lidar-right/

Takeaway: current Lidar systems are expensive ($70K). Many startups are trying to build $250 Lidar units, but those have short range and aren't yet available.

Pungoteague_Dave | 12 September 2017

None of the current technologies can see or read subtle things like hand signals. Gonna be a very long wait, and no Tesla yet delivered actually has FSD capacity, now or in the future.

Tropopause | 12 September 2017

TT,

Thanks for the link. Great article. Those who wish to bet against Elon will be watching from the sidelines.

carlk | 12 September 2017

Thanks for the link. I believe what the author said is pretty much why Tesla decided to go this route. Tesla believes in the power of machine learning and wants to send its students to school as early as possible.

Boonedocks | 13 September 2017

If I read that article correctly, though, Tesla doesn't use the kind of cameras necessary for it to work as described. Maybe HW3?

BigD0g | 13 September 2017

Great article, and I agree with Boonedocks that HW3 is going to be necessary; those of us who jumped on the FSD wagon with 2.0 will be licking our wounds for a few years. But I still love the car!

I suspect we'll get 2.0 FSD on highways, but then get prompted before exits for local roads, à la Lvl 3 automation. But without some HW upgrades, I have my doubts now regarding HW2.

I think the reason HW2 suffers so badly on "tight" curves is that it essentially can't see them: the cameras always face directly forward and can't pivot to the sides, so on a tight radius the car loses track of the curve because it can't look to the right or left to see where the road is really going. I guess they'll get this working with HD maps and additional cameras to figure out the radius, but right now we have a long way to go just for EAP+ functionality.
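To put rough numbers on that intuition: on a constant-radius curve, a point on the lane edge at arc length s sits at a bearing of about s / (2R) from straight ahead, so it leaves a camera with half field of view θ once s exceeds 2Rθ. A minimal sketch in Python, with FOV values that are assumptions for illustration rather than Tesla's actual specs:

```python
import math

def visible_arc_length(radius_m: float, half_fov_deg: float) -> float:
    """Arc length of a constant-radius lane edge that stays inside a fixed
    forward camera's view. For a car at the start of the curve, a point at
    arc length s sits at a bearing of s / (2 * R) radians from dead ahead,
    so it exits a half field of view of theta once s > 2 * R * theta."""
    return 2.0 * radius_m * math.radians(half_fov_deg)

# Assumed half-FOV values for illustration only (not Tesla's actual specs).
for label, half_fov_deg in [("narrow", 17.5), ("main", 25.0), ("wide", 60.0)]:
    s = visible_arc_length(radius_m=100.0, half_fov_deg=half_fov_deg)
    print(f"{label:6s} camera: curve stays in view for ~{s:5.1f} m of arc")
```

With these assumed numbers, on a 100 m radius curve a narrow camera keeps the lane edge in frame for only about 60 m of arc while a wide-angle camera holds it for over 200 m, which is roughly the behavior described above.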

I wouldn't even start the FSD timer until something is delivered from the EAP+ functionality option. And no, TACC/AS is not EAP+; that's AP1. EAP+ should be highway transitions and auto lane changes (which I think will also be harder without a rear-facing radar, but doable with cameras).

carlk | 13 September 2017

@BigD0g There are two forward-looking cameras. One looks straight ahead and the other has a wide-angle view.

Bob.Calvo | 13 September 2017

Hand Signals? It can't detect clowns either. I can't remember seeing either on the road. ;-)

DRFLGD | 13 September 2017

Hand signals? As in the one finger salute?

BigD0g | 13 September 2017

@Carlk Actually, there are 3: narrow, normal, and wide. Currently we only use narrow and normal. I suspect adding the wide will solve the exit ramp scenario.

I guess, to be technical, there are 5 forward-facing cameras, as the 2 B-pillar cameras also face front, albeit at an angle.
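For reference, here is a minimal sketch of the HW2 camera layout as it is commonly described. The range figures approximate Tesla's published numbers from the time and should be treated as assumptions rather than exact specifications:

```python
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    direction: str        # which way the camera generally looks
    approx_range_m: int   # approximate published figure, treated here as an assumption

# Assumed HW2 camera layout; ranges approximate Tesla's published numbers circa 2016.
HW2_CAMERAS = [
    Camera("narrow forward", "forward", 250),
    Camera("main forward", "forward", 150),
    Camera("wide forward", "forward", 60),
    Camera("left B-pillar", "forward/side", 80),
    Camera("right B-pillar", "forward/side", 80),
    Camera("left repeater", "rear/side", 100),
    Camera("right repeater", "rear/side", 100),
    Camera("rear view", "rear", 50),
]

forward_component = [c for c in HW2_CAMERAS if "forward" in c.direction]
print(f"{len(forward_component)} of {len(HW2_CAMERAS)} cameras look at least partly forward")
```

Counting everything with a forward component gives the five cameras mentioned above.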

carlk | 13 September 2017

That AP2 can not return the hand signal is its biggest advantage.

SamO | 13 September 2017

Yup. Proof positive that cameras cannot see hands. Tesla should just stop selling cars. NOW!

/s

Bob.Calvo | 13 September 2017

@ DRFLGD,

My apologies. How could I forget?

https://www.youtube.com/watch?v=usYAqLN3lTs

PatientFool | 13 September 2017

I absolutely believe that eventually software will catch up to the point where cameras will be enough, particularly with the advancement of AI. Multiple cameras (which could even record in multiple light spectra) plus a computer will eventually outperform humans with two eyes that only face forward. I also think it's pretty obvious we'll see additional generations of the hardware with more/better cameras, compute power, and probably additional sensors (even lidar if/when it makes sense).

DRFLGD | 13 September 2017

B.C: Too funny. Was up late last night watching that movie.

TeslaTap.com | 13 September 2017

@1jetskier7 "So currently, profit trumps safety?"

What profit? Tesla is not profitable. Should Tesla add $100K to the price of the car to add LIDAR today or somehow just eat an extra $100K per car? Are sales going to continue with a dorky rotating dome on the top of the car? Will owners accept a 10% range reduction for the power needed to run all the extra LIDAR electronics and reduced aerodynamics? Ok, this last one is a total guess, but it can't be ignored either.

What almost everyone seems to forget is that the software and computing power needed for LIDAR is huge! Even if the LIDAR sensors were free, you would still need a huge processor to deal with the data and turn it into meaningful information. Then the software task is perhaps 10 times that necessary for a vision system. As the complexity grows in both hardware and software, the reliability risks go up dramatically too!
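To put a rough number on that data argument, here is a back-of-the-envelope sketch. The point rate and bytes per point are assumptions chosen only to show the order of magnitude:

```python
# Back-of-the-envelope LIDAR data rate. Both numbers below are assumptions
# chosen only to illustrate the order of magnitude; real units vary widely.
points_per_second = 1_300_000   # assumed point rate for a high-end spinning unit
bytes_per_point = 16            # assumed: x, y, z, intensity stored as 32-bit values

bytes_per_second = points_per_second * bytes_per_point
print(f"~{bytes_per_second / 1e6:.1f} MB/s of raw point cloud")               # ~20.8 MB/s
print(f"~{bytes_per_second * 3600 / 1e9:.0f} GB per hour before processing")  # ~75 GB/hour
```

Even at these assumed rates the sensor alone produces tens of megabytes per second, before any registration, segmentation, or fusion, which is where the processing burden described above comes from.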

Now one can argue these $100K LIDAR test platforms perhaps do a good job. Since no one can buy one yet, how they really work in the real world is still unknown.

The larger question is still whether a LIDAR-based system can be built at an affordable cost, made reliable, and whether it will be any better than a vision-based system. Good questions for which there is no answer yet.

JAD | 13 September 2017

Price is not the main issue. If LIDAR worked now, Maybachs or MB S-Classes could sell in huge relative numbers if they were able to self-drive at L5 (regardless of price or an ugly dome). Neither technology exists at any price outside of a controlled demo environment. Both have advantages and disadvantages in certain conditions.

The only thing that is absolutely certain is that the companies spending millions/billions on the technology know a lot more about what will work in a year or two than anyone on this forum. Musk still seems very confident in vision working in about 1.5 years; other companies believe in LIDAR in about 5 years. They may both be right or both wrong, but only time will tell, so sit back and watch the world change completely over the next 10 years.

KD7UFS | 13 September 2017

I would observe that our entire road system was designed for vision. (Eyes)

kwen197 | 13 September 2017

Eyes are great; they deliver lots of accurate data to the brain. Unfortunately the brain is not so great.

The brain suffers from periods of inattention, distraction and, worst of all, egotistical actions.

dir | 13 September 2017

Can LIDAR read road signs? Both the text and pictures?

Madatgascar | 13 September 2017

Eye contact and hand signals. Let me explain for people who live in nice suburbs like Palo Alto. When you are pushing your way through a big city intersection crowded with pedestrians, you are constantly making eye contact, calling bluffs, giving and receiving polite gestures, and being a little pushy to make sure traffic stays moving and clears the intersection.

Driving coast to coast - easy. Driving across Manhattan at rush hour - now that would be an impressive feat.

kwen197 | 13 September 2017

Madatgascar: Solution: only use automation where you think it is safe to do so. In the totally automated world we are heading toward, hand signals will eventually disappear.

SbMD | 13 September 2017

A good article to read. Thanks, @TT.

Cameras are the way to go, but they require the right engineering. It absolutely can be done.

To the naysayers, think of it this way: how do most people navigate the world with respect to driving? Vision! Do we use a biological equivalent of LIDAR? No. Would LIDAR be necessary to make the task of driving better, or just to enable it? It is being used to help enable driving more than being accretive to the task. You still need cameras.

Don't forget that individuals with monocular vision can drive. They have different "hardware" and hence require some "software patches," in a biological sense, compared to a person with binocular vision.

As for hand signals, and other ways to obtain additional road info, don't forget what else is coming: V2V and V2I communication. Hand signals and other "cues" will be supplanted by more exacting communication.
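As a sense of what that more exacting communication might carry, here is a minimal, hypothetical sketch of a V2V-style safety message. The field names and the JSON encoding are purely illustrative and are not the actual SAE message format:

```python
from dataclasses import dataclass
import json
import time

@dataclass
class BasicSafetyMessage:
    # Illustrative fields only; a real V2V message set defines many more.
    vehicle_id: str
    timestamp: float
    lat: float
    lon: float
    speed_mps: float
    heading_deg: float
    brake_applied: bool
    turn_signal: str            # "left", "right", or "none"

def encode(msg: BasicSafetyMessage) -> bytes:
    """Serialize for broadcast. Real systems use compact binary encodings;
    JSON is used here only to keep the sketch readable."""
    return json.dumps(msg.__dict__).encode("utf-8")

msg = BasicSafetyMessage("veh-123", time.time(), 37.4419, -122.1430,
                         speed_mps=12.5, heading_deg=90.0,
                         brake_applied=False, turn_signal="left")
print(encode(msg))
```

A car receiving a steady stream of messages like this from nearby vehicles and infrastructure would not need to infer intent from a wave or a glance.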

Sam100DD | 13 September 2017

@dir LIDAR cannot read signs in any way.

1jetskier7 | 13 September 2017

@TeslaTap.com Please stop providing misinformation.

A LiDAR suite does not cost $100K. The only unknown apparently is you not knowing.

"Now one can argue these $100K LIDAR test platforms perhaps do a good job. Since no one can buy one yet, how it really works in the real world is still unknown. The larger question is still can a LIDAR based system be done at an affordable cost, made reliable, and will it be any better than a vision based system. Good questions for which there is no answer yet."

Looks like Mercedes does know the answer and is buying them for production vehicles.

“We’re pleased to have our LiDAR technology included in Mercedes-Benz’s sensor setup for their fully automated and driverless vehicles. It reinforces Velodyne’s leadership in the space and further ensures that tomorrow’s autonomous vehicles are as safe and efficient as possible,” said David Hall, Chief Executive Officer, Velodyne LiDAR. “Mercedes-Benz is synonymous with outstanding performance as well as engineering quality, and we welcome the opportunity to contribute to such an exciting program.”

In order to bring autonomous driving within reach, Mercedes-Benz has developed an intelligently integrated setup of different sensors that will now include Velodyne LiDAR sensors. The so-called ‘sensor fusion’ enables a continuous situational analysis of the combined data from the various sensors. The goal is to guarantee reliable results to allow a robust planning of safe trajectories for automated vehicles. Velodyne has now begun producing automotive units at their new manufacturing plant in San Jose, CA.

1jetskier7 | 13 September 2017

Looks like Ford and Baidu know:

"Elon Musk may not think that lidar is a necessary sensor for self-driving cars, but pretty much everyone else in the inudstry disagrees. With its long range, impressive accuracy, wide field of view, and near-immunity to the vagries of ambient light and weather, lidar offers a volume of high quality data that's hard to get with cameras or radar. Companies like Ford and Baidu see lidar as an integral sensor on their near-future autonomous car projects, and to support that vision, they've just invested US $150 million in Velodyne, the company that makes the best lidar sensors on the planet."

1jetskier7 | 13 September 2017

Hmm, and Audi, Volvo, Subaru, Nissan, Toyota, GM, Waymo, Samsung, Uber, Lyft, etc. all have LiDAR in their solutions. I wonder what regulators will require or what the NTSB will recommend for added safety. Or I could be wrong.

"For its part, Velodyne has promised to build a solid-state lidar device, which John Eggert, director of automotive sales and marketing, says will use 32 laser lines and boast a range of 200 meters. And Israeli startup Innoviz Technologies claims to be making a $100 unit with a range of 200 meters and an angular resolution of 0.1°. Both firms have promised to put those sensors into production sometime in 2018. Quanergy, a Silicon Valley startup, is building its own $250 solid-state device due to go into production later this year."

Yes, I stand by my statement: so currently, profit trumps safety?

Tropopause | 13 September 2017

The space industry experts also told Elon a reusable rocket was impossible. Here's a man who keeps proving the status quo wrong time and time again, yet people continue to bet against Elon.

jgardner | 13 September 2017

It is not so much a hardware issue as it is a software issue in my opinion. The software needed to enable FSD is unbelievably complex regardless of the implemented hardware. The company with the best software / neural network approach and software engineering team will be the winner. Hard to say when that is going to happen, but it is likely several years away.
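As a toy illustration of what the "neural network approach" means at its very simplest, here is a sketch of a single learned classifier over two made-up features. Everything in it is assumed for illustration; the distance between this and a production driving stack is exactly the software gap described above:

```python
import numpy as np

# Toy sketch: learn to flag "obstacle ahead" from two synthetic, made-up image
# features. Purely illustrative; a real perception stack is a deep network
# trained on millions of labeled frames, and building that is the hard part.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(1000, 2))                  # fake feature vectors
y = (0.8 * X[:, 0] + 0.6 * X[:, 1] > 0.7).astype(float)    # synthetic ground truth

w, b, lr = np.zeros(2), 0.0, 0.5
for _ in range(2000):                        # plain gradient descent on log-loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability
    grad = p - y
    w -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
print(f"toy classifier accuracy: {(pred == y).mean():.1%}")
```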

1jetskier7 | 13 September 2017

I don't think it's so much that "people continue to bet against Elon"; it's more that the safest working solution will become the standard.

Autopilot is not fully working or even the safest. So how can that realistically turn into FSD with some software updates? A lot has happened in this story over the last year. Right?

1jetskier7 | 13 September 2017

HW1 and HW2 never achieved parity for EAP. Now it's HW2.5, then HW3, HW4, and HW5 before FSD. Looks like the software equation continues to get more complex just to achieve parity between the hardware versions. Every software update already has 5 or 6 variants just for the current hardware versions. Face it, HW2 is a total loss for FSD. IMHO.

Tropopause | 13 September 2017

A lot of assumptions there, jetskier.

Tropopause | 13 September 2017

The way I see it, if Tesla can demonstrate a cross-country FSD trip, they have demonstrated the ability to FSD across nearly all possible scenarios. Elon is still shooting for the end of this year or shortly thereafter. It will be an amazing accomplishment and will shake the foundation of the competition and their lidar principles.

Haggy | 13 September 2017

"1. They will never know what to look for the way a human brain does.
2. They won't stay clean - human eyes blink constantly to stay clean.
3. They can't make eye contact with other drivers or pedestrians, or exchange hand signals.
4. I've been using AP for two years now, limitations of the forward cameras are really obvious."

Each camera will know what to look for and won't have to worry about things that other cameras have to look for. That will be for the computer to take into account.

The front cameras are behind the windshield, and the windshield has wipers. I haven't had a problem with the cameras, since I need to see out of the windshield anyway, but in theory Tesla could activate the squirters and wipers automatically if needed. On most cars, I never have to clean the rear camera lens. On the Model S, it tends to get spots from rain, but nothing that makes it unusable. A good design can keep the lenses clear. I don't know if Tesla ever addressed the issue with the Model S rear camera, but if they didn't, they should.
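On activating the squirters and wipers automatically: one plausible way to do that, sketched purely as an assumption rather than anything Tesla has described, is to watch the camera's own image sharpness:

```python
import cv2  # OpenCV

def lens_probably_obstructed(frame_bgr, blur_threshold: float = 60.0) -> bool:
    """Heuristic: a dirty or water-spotted lens lowers overall image sharpness.
    The variance of the Laplacian is a standard focus/blur measure; a value
    below the (tunable, assumed) threshold suggests the lens needs cleaning."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness < blur_threshold

# Hypothetical use inside a camera supervision loop:
# if lens_probably_obstructed(latest_frame):
#     request_washer_and_wiper_cycle()   # hypothetical vehicle API, for illustration
```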

They can't make eye contact with other drivers, so people will have to rely on the rules of the road. It might be tough when your car is partway into another lane and a car speeds up to try to block you, but by definition, yours is the car in front. The other driver will stop rather than hit you, and possibly learn that he should avoid challenging self-driving cars that are following the law. My current Model S can't make eye contact either, but it has gotten pretty good at merges. If somebody ahead of me did make eye contact, I could gesture them to move in quickly, thus ensuring that my car knows about it sooner rather than later.

I've been using AP for as long as it has existed and it keeps getting better. Nevertheless, FSD has parts that are being developed independently that don't rely on lane lines, so I don't think we can compare the two.

Boonedocks | 13 September 2017

If Tesla indeed completes a cross-country FSD demo, this year or next, I sure hope they do it in a HW2 car to prove that the HW2 hardware will work. If they do it in a HW2.5-3 car......I think they will have a HUGE....UGE....problem on their hands.

1jetskier7 | 13 September 2017

Agreed @Boonedocks

Although, I have already seen a video on how Tesla FSD works. That was a year ago. What does another video really prove?

I've ridden in a Google / Waymo vehicle in FSD. That actually demonstrates something. Can't say the same for a Tesla FSD test ride.

BigD0g | 13 September 2017

@boonedocks "if" that video is ever made, I seriously doubt they are going to let you know what HW it was done with.

P.s. There's not a chance in hell that video happens this year.

inconel | 13 September 2017

I am continuously amazed at the competency levels of my fellow posters in this forum. We seem to know better than Elon and his engineers. This forum is great!

phil | 13 September 2017

Tropopause | September 13: "The space industry experts also told Elon a reusable rocket was impossible. Here's a man who keeps proving the status quo wrong time and time again, yet people continue to bet against Elon."

Is this meant to be a joke? The space industry experts not only designed a reusable rocket in 1969, before Elon was born, they launched it - and reused it - hundreds of times, beginning in 1981. It was called the Space Shuttle, powered by the reusable Aerojet Rocketdyne RS-25 engines. You might as well tell us that Elon Musk invented fire.

https://en.wikipedia.org/wiki/Reusable_launch_system

RedShift | 13 September 2017

@haggy

I don't believe that 'a good design can keep the lenses clear'. We have eyelids to keep our eyes clean. Unless the cameras that aren't behind the windshield have a sprayer and a small wiper built in, I don't see HOW the system will work reliably. Having those systems will make it more complex, with more parts prone to failure, but I cannot imagine a robust system without them in place.

Also, I have had visibility issues multiple times in the winter with my Model S's rear view camera, just from a rather large water drop on the camera. Some other cars have a cover for the rear camera. BMW hides it behind their roundel, in fact. Not the Model S.

Mchenry7 | 13 September 2017

I would have more faith in the front cameras seeing things more clearly if rain sensing wipers actually worked as advertised.

DonS | 13 September 2017

LIDAR does provide a nice set of 3D data. Cameras require more work to figure out 3D from 2D images, but it is possible. Overall, Autopilot is great for well-defined roads and vehicle interactions, but it will take way more than a decade to solve the fuzzy areas. Interactions with pedestrians in crowded cities, or driving on a snow- or water-covered road, are cases that self-driving cars haven't even begun to figure out. I don't see LIDAR being much help in these difficult areas.
I won't be buying a vehicle that doesn't even have a steering wheel. A tow truck is not an acceptable solution to conditions not previously seen by the software developers.
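On recovering 3D from 2D: with two cameras, depth comes from plain triangulation, Z = f * B / d. A minimal sketch with made-up focal length and baseline values:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic stereo triangulation: Z = f * B / d.
    focal_px     - focal length expressed in pixels
    baseline_m   - distance between the two camera centers
    disparity_px - horizontal shift of the same feature between the two images"""
    if disparity_px <= 0:
        return float("inf")   # no measurable shift: feature is effectively at infinity
    return focal_px * baseline_m / disparity_px

# Made-up example values: 1000 px focal length, 12 cm baseline.
for d in (40.0, 10.0, 2.0):
    print(f"disparity {d:4.1f} px -> depth ~{depth_from_disparity(1000.0, 0.12, d):5.1f} m")
```

The same relation also shows why camera depth error grows roughly with the square of distance (a one-pixel disparity error matters far more at 60 m than at 3 m), which is part of why LIDAR's direct ranging is attractive at long range.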

Pungoteague_Dave | 13 September 2017

@Bob.Calvo "Hand Signals? It can't detect clowns either. I can't remember seeing either on the road. ;-)"

@DRFLGD "Hand signals? As in the one finger salute?"

@carlk "That AP2 can not return the hand signal is its biggest advantage."

@SamO "Yup. Proof positive that cameras cannot see hands. Tesla should just stop selling cars. NOW!"

Seriously guys? Not a day goes by when I do not see and use hand signals when driving, riding a motorcycle, or cycling. Please think again - construction flaggers (had about five today), courteous drivers in parking lots, pedestrian interactions, etc. There are many other examples - many city roundabouts in the EU have police with white gloves directing traffic.

The example that I believe trumps FSD for a very long time: picture an FSD Tesla driving up the two-lane PCH in a curvy section. The car comes upon a line of six road bicycles doing 18 mph. The car can either hold up traffic and ride along at 18, or go around. But it is a long line of bicycles, and there's sporadic oncoming traffic and limited passing gaps painted on the road. The second-from-last bicycle rider hand-signals to come around and leaves a gap for the car. As a bicyclist, I do this all the time (signal cars either my intention, or what they can do to get around me). So yeah, hand signals are a pretty important thing in driving. Ask any cop.

A human can see and analyze real-world targets and moving elements and consider relative speeds and the varying importance of those inputs, including the subtleties of hand/finger meanings, and I do not believe those are susceptible to machine learning in the same way. Talk about deep learning and supercomputers all you want - my daughter beat Watson at Jeopardy (one of two in the world to have done so) - so the machine is fallible too. In many cases it is perhaps less likely to make a mistake than the human, but we're not going to accept the machine killing the bicyclist even if it is statistically the correct thing to do.

georgehawley.fl.us | 13 September 2017

FSD is going to be very difficult to develop to a level of reliability that will satisfy owners and regulators. Proponents will argue that, while not perfect, FSD is safer than the average human driver, but every accident involving FSD will receive outsized publicity, contradicting the statistics. The image processing software will not only have to achieve better accuracy and reliability under diverse conditions than above-average human vision, but will also have to incorporate AI content that uses, among other things, defensive driving techniques: anticipating things that could go wrong and having a plan of action like an escape route, or interpreting see-ahead events, extrapolating to possible consequences, and then taking action to avoid an accident.

I remember as a 16-year-old being asked by my Dad to take the wheel of the family car on the way home from vacation. We were in a line of closely packed traffic on a 2-lane highway on the way out of Ames, Iowa. We were moving at maybe 30 mph. We had no directional signals, so I rolled down the window for arm signals. Looking ahead, I saw a semi turn left out of our lane into a restaurant parking lot. About 4 cars closer to us was a matching semi that suddenly turned left to join the other truck. The car behind the second truck slammed on his brakes and started a chain reaction of hoods and trunks flying open. We were another 10 cars back. I saw the truck turn and started to slow at once. I gave the driver behind us a hand signal to show that we were stopping. We stopped short of the guy in front of us, and the guy behind us stopped short of rear-ending us but got rear-ended himself. We were the only car in the line of cars behind the truck that was not involved in the accident. My Dad was really happy.

Anecdotal, yes, but a real story. There are billions of combinations of things that will have to be individually programmed into FSD systems to avoid accidents. Any one left out will be a hole in the system, an accident waiting to happen. How will management know that the software is good enough? What will be the boundary conditions for satisfactory results?

Another anecdote: recently I was driving on a multi-lane interstate highway. There was a pickup truck in front of me with a stepladder in the back and the tailgate down. I didn't like the looks of the ladder which was moving. I changed lanes and, as I did so, the ladder slid out of the truck. Will that be a boundary condition?

Pungoteague_Dave | 13 September 2017

@SbMD "As for hand signals, and other ways to obtain additional road info, don't forget what else is coming: V2V and V2I communication. Hand signals and other "cues" will be supplanted by more exacting communication."

@kwen197 "In the totally automated world we are heading too, hand signals will eventually disappear."

Really? And what about bicycles, tractors, pedestrians? Gonna implant comm devices in them too? People, there's a really complicated world out there, made up of mostly unknowns for which there are no "rules", only behaviors. Maps solve maybe 5% of the navigation equation. Maybe it's because I do most of my annual mileage on unpaved roads in the third world, but until FSD can handle driving a dirt road in Tanzania at night, with random zebra and hyena crossings, it isn't going to be a thing except maybe in places like big cities. Remember that 90% of the world's roads are unpaved, as are over half of the road miles in the U.S. You guys in New York and California do not live in the real world. Where I ride/drive much of the time, people intentionally throw their livestock in front of cars because the law says they must be reimbursed, and it's easier than going to market and they can keep the carcass. That has happened to me six times.

@BigD0g "I suspect we'll get 2.0 FSD on highways, but then get prompted before exists for local roads ala Lvl 3 automation."

Isn't it true that Elon says current FSD hardware and future software will allow Tesla to offer an Uber-like service that uses these cars, unattended, to make the car payments for owners? I believe that was and remains a very specific promise to CURRENT owners of 2.0 cars. FSD means: put your kid in the car and send it to the school drop-off, unattended. Or Tesla summons the car for service, unattended. In no world does FSD imply any driver interaction with the controls, other than indicating the destination. Nothing less.

1jetskier7 | 13 September 2017

@Pungoteague_Dave: you are 100% correct, and that makes me super sad. The Model 3 ride-hailing service should begin to work 1 mile at a time over the next 5-15 years, most likely starting in Fremont, which I drive by every day.

So I should be able to start using it in my 2018 M3 around 2022 in Palo Alto, CA. So boo hoo for most of you.

SbMD | 13 September 2017

@PD - Systems are being designed to understand other complexities of negotiating an environment, so that we will not be held to a "technological trap" of being unable to understand a hand signal or other cues.

There are a number of white papers on this topic, so nothing proprietary is being alluded to here.

Besides, in a V2V/V2I-enabled setup you won't need hand cues, but I can see how you would get a machine to recognize specific signals, if not also provide those signals itself with something akin to a "hand wave".

Furthermore, you "solve vision" for machines and it advances an entire area important to human-machine interactions. (In reality, it is "solve vision and decision-making".) This tech isn't just about cars, but it is the "vehicle" (pun intended) for making a significant technological leap. This is about how you would leverage this technology across many different applications.

Teaching a machine to navigate a complex, dynamic environment is a technological Holy Grail, and not just for cars.
