Too many LiDARs and radars might pose a risk to autonomous vehicles

Active sensors such as LiDARs and radars can suffer from mutual interference between vehicles equipped with the same sensor technology. Might this pose a risk to other autonomous vehicles around them?
Self-driving car on road

Autonomous vehicles – more often referred to as self-driving cars and trucks – are close to becoming a common sight on our streets. Many companies, from established automakers to tech giants, are developing the technology needed to make them a reality. From complex driving software to steering equipment, a new industry is growing around the cars of the future.

Among the most important features of these cars are the sensors they use to perceive the world around them. These fall into two groups – passive and active. Active sensors project energy into the world and use the reflections that bounce off objects back into the receiver to detect what is there and how far away it is. Passive sensors, on the other hand, use energy that is already in the world – particularly light (from the sun or streetlamps) or heat – to capture color or temperature differences and detect objects. A variety of sensors of each type is being tested, and several different technologies could come out on top. But one of the most important distinctions in the future of autonomous driving is whether the sensors will be active or passive.
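
To make the active-sensor principle concrete, here is a minimal sketch of the time-of-flight calculation a pulsed LiDAR relies on: range is recovered from how long the emitted pulse takes to bounce back. The code and numbers below are purely illustrative, not any particular sensor's implementation.

```python
# Minimal time-of-flight sketch: an active sensor emits a pulse and
# infers range from how long the echo takes to return.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, in meters.

    The pulse travels out and back, so the one-way distance is
    half the round-trip time multiplied by the speed of light.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: an echo arriving 400 nanoseconds after emission
# corresponds to an object roughly 60 meters away.
print(f"{range_from_round_trip(400e-9):.1f} m")  # -> 60.0 m
```
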
There are advantages to each type, mainly concerning changing weather and challenging lighting conditions. While active sensors are usually more resilient to these challenges, they carry the burden of health and safety concerns and of mutual interference between vehicles equipped with the same sensor technology. Crosstalk from mutual interference may seem like a remote concern today, given the small number of fully equipped autonomous vehicles on the road. But imagine a busy intersection or heavy highway traffic in which every car is emitting millions of laser beams in all directions from each of the 5 to 10 LiDARs it carries. Now add 360° radars (short-range and long-range) reflecting off every object, plus tens of ultrasonic sensors bouncing signals back from each adjacent vehicle.
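
A crude back-of-envelope model can show why this matters. The sketch below treats foreign laser pulses as a random (Poisson) stream arriving at one receiver; every quantity in it – vehicle counts, pulse rates, window lengths, and the geometric factor – is an illustrative assumption, not measured data.

```python
import math

def crosstalk_probability(n_interferers: int,
                          pulses_per_second: float,
                          listen_window_s: float,
                          geometric_fraction: float) -> float:
    """Chance that at least one foreign pulse lands in one listening
    window, modeling foreign arrivals as a Poisson stream.
    geometric_fraction = share of foreign pulses that actually enter
    this receiver's field of view (a rough stand-in for geometry)."""
    expected = (n_interferers * pulses_per_second
                * listen_window_s * geometric_fraction)
    return 1.0 - math.exp(-expected)

# Illustrative numbers only: 20 nearby vehicles with 5 LiDARs each,
# 100,000 pulses/s per LiDAR, a 1-microsecond listening window, and
# 1 in 10,000 foreign pulses reaching the receiver.
p = crosstalk_probability(20 * 5, 100_000, 1e-6, 1e-4)
returns_per_second = 100_000  # listening windows opened per second
print(f"p per window: {p:.4%}, "
      f"corrupted returns/s: {p * returns_per_second:.0f}")
```

Even with only one foreign pulse in a thousand windows, a sensor opening a hundred thousand windows per second would see on the order of a hundred corrupted returns every second.
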

Multiple sensor interference
Illustration of what typical highway sensor reflections might look like. This depiction is heavily simplified; in reality the scene would be far more crowded, and the longer the sensors' range, the more congested it becomes.

There are many ways to mitigate these technological challenges, such as frequency modulation and narrow band-pass filters. However, there is still a critical point at which false-positive alerts, or worse, false-negative alerts, might cause life-threatening events.
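
One family of mitigations alluded to above is coding: each sensor stamps its emissions with its own signature so the receiver can reject pulses that don't carry it. The sketch below correlates a received signal against a vehicle-specific pseudo-random pulse code; the code lengths, noise levels, and signal shapes are illustrative assumptions rather than any vendor's actual scheme.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Each sensor gets its own pseudo-random binary code (its "signature").
own_code = rng.integers(0, 2, size=64) * 2 - 1       # values in {-1, +1}
foreign_code = rng.integers(0, 2, size=64) * 2 - 1   # another vehicle's code

# Simulated received signal: our echo at sample 500, a foreign
# pulse at sample 200, plus receiver noise.
signal = rng.normal(0.0, 0.5, size=2000)
signal[500:564] += own_code
signal[200:264] += foreign_code

# Matched filter: correlate with our own code. Our echo produces a
# sharp peak; the foreign pulse correlates weakly and stays near the
# noise floor, so it is rejected rather than read as a false object.
correlation = np.correlate(signal, own_code, mode="valid")
peak = int(np.argmax(correlation))
print(f"echo detected at sample {peak}")  # ~500, not 200
```
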

The ultimate solution is, of course, to use passive sensors that don't emit any radiation or energy into their surroundings – more commonly known as cameras – instead of emitting sensors. The advantage of cameras is not only the high resolution and precision with which they capture information, but also the fact that they don't emit energy pulses (in the form of spectral radiation or ultrasonic waves) to acquire that data. Yes, visible-light cameras need an external light source to get sufficient contrast between two (or more) surfaces, but that source is usually already available, since human eyes need it as well (daylight, streetlamps, and headlamps).

However, darkness is still a serious risk factor for driving. According to studies, more than 56% of all fatal crashes occur at night, with the largest share happening during sunset hours – clearly a result of impaired vision. The technological solution that comes to save the day (and night) is thermal cameras, particularly long-wave infrared (LWIR) cameras. Recent developments enable faster, clearer, cleaner, and more accurate detection of heat differences between surfaces, and even of temperature differences within the same surface. Thermal cameras collect the thermal energy (heat) emitted by any body or object and, just like regular cameras, create a distinct picture, using temperature differences instead of color – enabling a clear image of the world around us even in complete darkness and harsh weather conditions. They do not emit any radiation or energy; hence, they are passive sensors. Combining visible and thermal cameras will upgrade the ability of the driverless cars of the near future to travel safely on the roads in the most adverse lighting and weather conditions.
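
As a rough illustration of "temperature differences instead of color", the sketch below maps a raw 16-bit LWIR frame to an 8-bit grayscale image by stretching the range of values actually present in the scene; the sensor values and frame layout are illustrative assumptions.

```python
import numpy as np

def thermal_to_grayscale(raw_frame: np.ndarray) -> np.ndarray:
    """Map a raw 16-bit LWIR frame to an 8-bit image.

    Pixels encode relative heat, so we stretch this frame's
    min..max across 0..255: warm objects (pedestrians, vehicles)
    render bright against a cooler background, with no external
    light source required.
    """
    frame = raw_frame.astype(np.float64)
    lo, hi = frame.min(), frame.max()
    if hi == lo:                       # flat scene: avoid divide-by-zero
        return np.zeros(frame.shape, dtype=np.uint8)
    return ((frame - lo) / (hi - lo) * 255.0).astype(np.uint8)

# Illustrative frame: a cool background with one warm "pedestrian" patch.
frame = np.full((120, 160), 7000, dtype=np.uint16)
frame[40:80, 60:80] = 7600            # warmer region
image = thermal_to_grayscale(frame)
print(image.min(), image.max())       # 0 255 -> full-contrast image
```
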

Foresight is committed to designing, developing, and commercializing a range of sensor systems and associated technologies for use in autonomous vehicles. These systems include stereo and quad-camera configurations that combine visible and thermal imaging, along with proprietary software that enables a car to interpret the video feed from the cameras in real time. They will be used to help avoid accidents on the road and will eventually enable fully autonomous self-driving cars that can drive in harsh weather and challenging lighting conditions.
