Perception is another story.
It is what causes problems for self-driving cars in rainy or snowy conditions.
What is perception about again? Sensors
What sensors are used?
Depending on the strategy and the company, a self-driving car will use cameras, LiDARs, and RADARs.
RADARs work under rain or snow, but the sensor is not trustworthy enough on its own.
It is inherently noisy, but it can see through obstacles, and through fog and other bad conditions.
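To give an idea of what "noisy" means in practice, here is a minimal sketch of one classic way to tame it: a rolling median over successive range readings, which knocks out isolated spikes while keeping genuine changes in distance. The function name and the sample values are illustrative assumptions, not any real radar stack.

```python
def smooth_ranges(readings, window=3):
    """Median-filter a stream of radar range readings (in metres).

    A rolling median removes the isolated spikes typical of radar
    noise while preserving real changes in the measured distance.
    Hypothetical helper for illustration only.
    """
    out = []
    for i in range(len(readings)):
        lo = max(0, i - window // 2)
        hi = min(len(readings), i + window // 2 + 1)
        chunk = sorted(readings[lo:hi])
        out.append(chunk[len(chunk) // 2])  # median of the window
    return out

# A noisy spike (55.0) in an otherwise steady ~10 m track:
print(smooth_ranges([10.0, 10.1, 55.0, 10.2, 10.1]))
```

The spike disappears after filtering, while the readings around it stay close to their original values.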
LiDARs use laser light to sense obstacles.
If it rains or snows, impact points will appear everywhere around the car,
because the laser beams can hit every raindrop and snowflake they cross.
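These rain and snow hits tend to be isolated points, while real obstacles produce dense clusters of returns, so one common countermeasure is a radius outlier filter. A minimal sketch with NumPy (the function name and the toy point cloud are my own assumptions, not a production pipeline):

```python
import numpy as np

def remove_rain_noise(points, radius=0.5, min_neighbors=2):
    """Drop points with fewer than `min_neighbors` other points within
    `radius` metres: raindrop hits are isolated, obstacles are dense.

    Brute-force pairwise distances are fine for a demo cloud; real
    stacks use a KD-tree (e.g. scipy.spatial.cKDTree) for speed.
    """
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    neighbors = (dist < radius).sum(axis=1) - 1  # exclude self
    return points[neighbors >= min_neighbors]

# A dense "obstacle" cluster plus two isolated "raindrop" hits:
cloud = np.array([
    [5.0, 0.0, 0.1], [5.1, 0.0, 0.1], [5.0, 0.1, 0.2],  # obstacle
    [2.0, 3.0, 1.5],                                     # raindrop
    [8.0, -4.0, 2.0],                                    # raindrop
])
clean = remove_rain_noise(cloud)
print(len(clean))  # → 3, only the clustered points survive
```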
The last sensor we can use with the RADAR is the camera.
Technically, a camera simply captures what we see with our eyes.
And humans are able to drive in heavy conditions.
I wanted to see if it would really work in conditions where no lane line is visible and no clear road is defined.
I ran a custom freespace algorithm I trained to detect drivable areas.
Here is the result.
It is quite impressive
It can detect my lane even if the camera is filled with snowflakes…
… and the wipers are constantly on…
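I can't paste the network itself here, but the final step of most freespace models is the same: turn a per-pixel "drivable" probability map into a binary mask. A minimal sketch, assuming such a probability map as input (the function and the bottom-up connectivity rule are illustrative, not my actual pipeline):

```python
import numpy as np

def drivable_mask(prob_map, threshold=0.5):
    """Binarize an H x W map of per-pixel 'drivable' probabilities,
    then keep only pixels connected to the bottom row -- the road
    always starts at the ego vehicle, so a floating 'drivable' blob
    in the sky is almost certainly noise.
    """
    mask = prob_map >= threshold
    keep = np.zeros_like(mask)
    keep[-1] = mask[-1]
    # Sweep bottom-up: a pixel survives if it is drivable and the
    # pixel below it (or a diagonal below) already survived.
    # (np.roll wraps around the image edges -- fine for a sketch.)
    for row in range(mask.shape[0] - 2, -1, -1):
        below = keep[row + 1]
        reachable = below | np.roll(below, 1) | np.roll(below, -1)
        keep[row] = mask[row] & reachable
    return keep
```

On a tiny toy map, a high-probability blob in the top corner gets dropped because nothing connects it to the bottom of the image.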
The next step was to see if obstacle detection would work
And the result didn’t disappoint again
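The detector itself isn't reproduced here, but camera-based obstacle detectors almost always end with non-maximum suppression to merge duplicate boxes over the same object. A minimal pure-Python sketch, independent of any specific model:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy non-maximum suppression: keep the highest-scoring box,
    drop every box that overlaps it too much, and repeat."""
    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    kept = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thresh for j in kept):
            kept.append(i)
    return kept

# Two near-duplicate boxes on one car, one box on another:
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
print(nms(boxes, [0.9, 0.8, 0.7]))  # → [0, 2]
```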
What does it mean?
Self-driving cars rarely rely solely on the camera.
So these two pictures alone are not enough to conclude that self-driving cars can work under rain or snow.
The camera has a lot of potential
Companies like Tesla are betting big on this sensor
It is the closest we have to human eyes
Some scenarios, like fog, will still be a big issue,
making driving impossible.
But rain might be manageable thanks to this sensor.