What does the AP see at night?

I use Autopilot for over 95% of my driving, and I trust the AP cameras at night. I've always assumed the cameras can see well beyond the frequencies that humans can perceive.
Is my assumption correct?

I've seen many videos that depict what AP can see, and I'd love to see such videos for nighttime. Do we have any such videos?

Observing my dashcam footage, I feel that the dashcam image isn't really what the camera actually captures; it might just be a 'filtered' image for humans to view.

I'd appreciate any information that will help me understand nighttime AP better. I'm relying on AP at night quite a bit, so the more I know the better.

lbowroom | 5 February 2020

"well beyond the frequencies that humans can perceive"

and what is the source of these wavelengths that the cameras would capture? For example, infrared cameras have an infrared light source shining ahead. Are you surmising that there are light sources invisible to humans tucked under the bumper?

Joshan | 5 February 2020

teslamotorsclub.com/tmc/threads/what-tesla-autopilot-sees-at-night.135011/

vmulla | 5 February 2020

@lbowroom,
Same source as the light that human eyes observe - there is light beyond the human eye's perception range.
To your point, perhaps there is enough output from the LED lamps at a frequency that humans cannot see but the cameras can?

Also, the word I'm using is 'assume', I'm not surmising :) I'm trying to find out if there is any merit to my assumption.

NKYTA | 5 February 2020

Ghosts.

vmulla | 5 February 2020

Joshan | 5 February 2020
teslamotorsclub.com/tmc/threads/what-tesla-autopilot-sees-at-night.135011/
---
Thanks @Joshan. I feel I should really give TMC a chance.

I like this 1375mlm fellow on TMC :) The TMC thread makes me feel more confident about AP in dark conditions.

Joshan | 5 February 2020

Yes, a very informative thread. Sometimes TMC is a bit stale, but there's lots of good info that doesn't get derailed by Fish.

vmulla | 5 February 2020

Last night it was drizzling and a guy was driving on the Capital Beltway without headlamps on. He zipped around me while I was trying to make an auto lane change. There is no way I would have observed that car with my own vision; there was only enough ambient light to see him if someone was paying particular attention, or had sharper vision. I thought I'd ask the question and learn :)

Twochewy | 5 February 2020

You use AP for 95% of your driving? I'd say you're a statistical outlier. I don't trust it nearly enough to use it near or in the city. Heavy traffic + rain + AP isn't a good combination for safety. AP is in beta. The last thing we need is a bunch of folks using it like FSD (which it isn't) and outside of its recommended usage.

vmulla | 5 February 2020

@Twochewy,
I don't think I'm a statistical outlier at all. I think this is normal for folks who are comfortable with AP.

I'm not treating EAP like FSD (in the true sense); I've learned the limits of the system and I engage it quite a bit in the city. Examples:
If I know I'm going to be stuck/crawling at a traffic light for a while, I turn on EAP and watch the intersection.
I often follow other cars using AP on divided highways; I pay attention to traffic and signals, but I'm not actively driving.

I only intervene when I know the system cannot handle a certain situation, or when I need to make a turn - otherwise, I just stay in my lane and let AP do its thing.

A big part of using AP responsibly in the city is learning its limitations, and that is what I'm trying to find out in this thread.

Joshan | 5 February 2020

@vmulla +1

bddaughe | 5 February 2020

+2 - I use it all over the place and have found it handles a lot of situations I wouldn't have expected, like making right turns as long as the turn isn't a 90-degree turn. I made a cross-city trip the other day by following other cars and making the proper lane changes when needed, and I didn't once deactivate Autopilot even though it was all city street driving. That said, by following people, it naturally stopped at stop lights when the cars in front of me did. But I was shocked to be able to drive several miles in city conditions without disengaging AP. It's so much closer than people give it credit for.

Twochewy | 5 February 2020

I guess people are comfortable with different levels of risk when they drive. I treat it as a beta app and never use it in the city. I live in the Seattle area, by the way.

kevin_rf | 5 February 2020

Uuuummmm... Slight but major correction: silicon CMOS sensors typically have a wavelength range of 350/400 nm to 1050 nm, while the human eye sees ~400 nm to ~750 nm, give or take a few nm.

The low end on the CCD is often limited by the window used. Glass cuts off somewhere around 400-ish nm; below that, special window materials (fused silica) and sensor coatings (phosphors) are used. Air stops everything below 200 nm - specifically, water vapor and O2 - and getting below 190 nm requires a pure N2 environment or a hard vacuum.

The upper end is 1050 nm (near IR). This is the hard physics limit of silicon: above that wavelength, the photons do not have enough energy to knock electrons off the silicon matrix, and you need different materials (InGaAs, PbS, etc.). On color cameras you often see IR-blocking filters that pull the upper limit down closer to the human range, somewhere between 650 and 750 nm. (Weird things happen above 750 nm.)
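
For anyone who wants to sanity-check that cutoff, here's a rough back-of-the-envelope sketch in Python (my own arithmetic, nothing from Tesla):

```python
# Photon energy vs. the silicon bandgap: a photon needs at least
# ~1.12 eV to promote an electron in silicon, which sets the
# long-wavelength cutoff of any silicon sensor.

PLANCK_EV_S = 4.135667e-15   # Planck constant, eV*s
LIGHT_SPEED = 2.998e8        # speed of light, m/s
SI_BANDGAP_EV = 1.12         # silicon bandgap at room temperature

# Cutoff wavelength: lambda = h * c / E_gap
cutoff_m = PLANCK_EV_S * LIGHT_SPEED / SI_BANDGAP_EV
print(f"Silicon cutoff: {cutoff_m * 1e9:.0f} nm")
# ~1107 nm - the theoretical edge; real sensors roll off a bit sooner
```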

That said, the industry standard for color cameras is to place a Bayer filter (a 2x2 pixel grid of red, green, blue, and green filters) over the sensor. That covers over 99.9% of all cameras on the market. Your eye only has red, green, and blue receptors - it works the same way.
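
Here's a toy sketch of how an RGGB Bayer mosaic is laid out - purely an illustration, not any real camera's pipeline:

```python
import numpy as np

# Toy RGGB Bayer pattern: each 2x2 block of sensor pixels sees
#   R G
#   G B
# so green is sampled twice as often as red or blue, matching the
# eye's peak sensitivity in green.
def bayer_channel(row: int, col: int) -> str:
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

sensor = np.arange(16).reshape(4, 4)  # fake raw pixel readings
for r in range(4):
    print(" ".join(f"{bayer_channel(r, c)}:{sensor[r, c]:2d}" for c in range(4)))
```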

So vmulla, Tesla's cameras are most likely limited to the human visual range - what we consider red, green, and blue.

That said, the actual sensitivity of the camera is definitely higher than the human eye's, especially with modern CMOS sensors.

Fun little tidbit: even on the darkest night, there is enough light for a sensitive imaging sensor to see as bright as day. Yes, I spent my whole career working with research-grade imaging sensors (ICCD, EMCCD, LN-cooled CCD, CCD, CMOS). Funny that the Tesla team sent me a rejection letter ;-P

vmulla | 5 February 2020

@kevin_rf,
I love your answer for its clarity, depth, and simplicity. Thank you.

andy | 5 February 2020

@vmulla - I don’t know the technical answer to your question, but the anecdotal answer is “more than me”.

We are just starting to come out of winter here with it being dark at all the peak times when people drive. I’ve (with caution) been supervising Autopilot a lot and have been very impressed with how well it tracks winding narrow roads at night with few road markings and no clearly defined edge.

What the car does struggle with is twilight, or similar, on a dual carriageway (4 lanes separated by a divider) with no features on the left (passenger) side. The car seems to read the lack of features as the camera being blinded; it seems to be thrown by the different contrast and visibility on each side, so there's some software tweaking to do.

Standard camera optics can struggle with different lighting conditions - I don’t know what the Model 3 uses and whether it is able to use a wider range of wavelengths than human eyes.

andy | 5 February 2020

@Twochewy I treat Autopilot as an advanced driver’s aid.

When you fly a plane you set the trim so that it'll maintain straight and level flight without control inputs. That frees you up to monitor systems, navigate and keep a good lookout. You are always on top of the plane and ready to intervene. Autopilot is very similar: used correctly, it frees you up to better manage the car and maintain situational awareness. You always need to be ready to override, which needs care itself so that you don't over-correct.

Your role does change, but I also find it adds to overall safety. Real care is needed though - road markings can cause the car to track unexpectedly, and you need to be aware of road markings, tar lines, cat's eyes (reflectors), low sun, congestion and the behaviour of other vehicles. The manual covers it well IMO.

M3phan | 5 February 2020

Use AP every day, almost everywhere. 85-90% is my best guess.
My guess is AP sees and reacts better than I do at night, what with the combo of cameras, ultrasonics and radar.

jimglas | 5 February 2020

dead people

vmulla | 5 February 2020

@M3phan,
Exactly!!

Now if we had those nighttime videos showing the edges of roads, the outlines of vehicles, signs, etc., that would be a nice touch - especially if we could compare them to raw dashcam footage.

casun | 5 February 2020

electric sheep?

M3phan | 5 February 2020

Dead people, lol

kevin_rf | 5 February 2020

Casun, are you dreaming of them?

andy | 5 February 2020

@casun “beware of the sheep” signs are not to be sniffed at.

Haggy | 6 February 2020

At night, the car uses headlights, which should enhance what it can see. You might want to try using a USB stick for the dashcam function, and then drive at night in the area you have in mind, such as an unlit road on a cloudy night with no moonlight.

vmulla | 6 February 2020

@Haggy,
I suspect that the images the dashcam displays on the computer screen are not the same images the computer uses for AP. I think the images have a filter applied to make them friendlier for human viewing. The simple reason I say that is that some of the images I've seen from the dashcam are washed out and don't have the clear edges that would help AP, and still the car manages to drive fine.
If someone has the setup, I'd encourage them to take a washed-out image from the dashcam and play with the image-processing settings to see if more detail is hidden in the image.
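
For anyone who has the setup, here's a minimal sketch of the kind of experiment I mean, using OpenCV (the filename is just a placeholder):

```python
import cv2

# Stretch the contrast of a washed-out dashcam frame to see whether
# detail is hiding in a narrow band of the histogram.
frame = cv2.imread("dashcam_frame.png", cv2.IMREAD_GRAYSCALE)

# Global histogram equalization spreads pixel values across 0-255.
equalized = cv2.equalizeHist(frame)

# CLAHE (local equalization) often recovers more in night scenes,
# since it adapts to dark and bright regions separately.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
local = clahe.apply(frame)

cv2.imwrite("equalized.png", equalized)
cv2.imwrite("clahe.png", local)
```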

kevin_rf | 6 February 2020

A first-order derivative should be able to pull out the edges in most cases...
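
Something like this, as a rough illustration with a Sobel filter in OpenCV (just a sketch, not Tesla's actual pipeline; the filename is a placeholder):

```python
import cv2
import numpy as np

# First-order derivative edge detection: the Sobel operator
# approximates the image gradient, and edges appear wherever the
# gradient magnitude is large, even in a low-contrast frame.
frame = cv2.imread("dashcam_frame.png", cv2.IMREAD_GRAYSCALE)

gx = cv2.Sobel(frame, cv2.CV_64F, 1, 0, ksize=3)  # d/dx
gy = cv2.Sobel(frame, cv2.CV_64F, 0, 1, ksize=3)  # d/dy
magnitude = np.hypot(gx, gy)

# Normalize to 0-255 for viewing and save.
edges = (255 * magnitude / magnitude.max()).astype(np.uint8)
cv2.imwrite("edges.png", edges)
```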