Tuesday, 23 April 2019

How self-driving cars 'see'

Building reliable vision capabilities has been a major development hurdle for autonomous vehicles. Today, however, a variety of sensors can be used to “see” better than human eyesight, says NVIDIA in a blog post.

The key is to have both diversity in the types of sensors and redundancy across them, which increases detection accuracy so that the car can navigate and gauge the shape, speed and distance of nearby objects.

The main sensors for autonomous vehicles are cameras, radar and lidar. In addition, sensors known as inertial measurement units help track a vehicle’s acceleration and location.
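To give a feel for what the inertial measurement unit contributes, here is a minimal dead-reckoning sketch that integrates acceleration readings over time to estimate speed and distance travelled. The sample values and time step are invented for illustration; a real vehicle fuses IMU data with GPS and the other sensors.

```python
# Minimal dead-reckoning sketch: integrate IMU acceleration readings to
# estimate velocity and position along one axis. The readings are made up
# for illustration; real systems fuse IMU data with GPS and other sensors.

def dead_reckon(accel_samples, dt, v0=0.0, x0=0.0):
    """Integrate acceleration (m/s^2) sampled every dt seconds."""
    v, x = v0, x0
    for a in accel_samples:
        v += a * dt          # velocity update
        x += v * dt          # position update
    return v, x

# Example: one second of gentle acceleration sampled at 100 Hz.
samples = [0.5] * 100        # constant 0.5 m/s^2
v, x = dead_reckon(samples, dt=0.01)
print(f"velocity ~ {v:.2f} m/s, distance ~ {x:.2f} m")
```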

Autonomous vehicles typically rely on cameras placed on each side to create a 360-degree view of their environment. Some have a wide field of view — as much as 120 degrees — and a shorter range. Others focus more narrowly to provide long-range visuals. Some cars may even use fish-eye cameras, which contain super-wide lenses that provide a panoramic view. These can be useful for parking.
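As a hypothetical illustration of that trade-off between field of view and range, the sketch below describes a camera rig as plain data. The camera names, angles and ranges are assumptions made up for the example, not any vehicle's actual specification.

```python
# Hypothetical camera rig: wide, short-range cameras for nearby coverage,
# a narrow long-range camera for the road ahead, and a fish-eye camera for
# parking. All values are illustrative.

from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    fov_deg: float       # horizontal field of view
    range_m: float       # useful detection range

rig = [
    Camera("front_wide",    fov_deg=120, range_m=60),
    Camera("front_narrow",  fov_deg=30,  range_m=250),
    Camera("left_side",     fov_deg=120, range_m=60),
    Camera("right_side",    fov_deg=120, range_m=60),
    Camera("rear",          fov_deg=120, range_m=60),
    Camera("fisheye_front", fov_deg=190, range_m=10),   # parking
]

for cam in rig:
    print(f"{cam.name}: {cam.fov_deg} deg field of view, {cam.range_m} m range")
```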

Though they provide accurate visuals, cameras have limitations. They can distinguish objects but cannot gauge how far away they are, and they are less useful in low-visibility conditions such as fog, rain or darkness.

This is where radar sensors come in: they supplement camera vision in times of low visibility. Radio waves are transmitted outward, and the properties of the signal that returns to the sensor provide data about the speed and location of any objects within range.
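As a simple worked example of that principle, the sketch below derives range from the echo's round-trip delay and relative speed from its Doppler shift. The delay, shift and carrier frequency are invented for illustration.

```python
# Radar principle sketch: range from the round-trip time of the radio wave,
# relative speed from the Doppler shift of the returned signal.
# Numbers are invented for illustration.

C = 299_792_458.0            # speed of light, m/s

def radar_range(round_trip_s):
    """Distance to the target: the wave travels out and back."""
    return C * round_trip_s / 2

def radar_speed(doppler_shift_hz, carrier_hz):
    """Relative radial speed from the Doppler shift (approaching > 0)."""
    return doppler_shift_hz * C / (2 * carrier_hz)

# Echo returns after 0.5 microseconds with a +5 kHz shift on a 77 GHz radar.
print(f"range ~ {radar_range(0.5e-6):.1f} m")              # about 75 m
print(f"closing speed ~ {radar_speed(5e3, 77e9):.1f} m/s") # about 9.7 m/s
```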

Source: NVIDIA blog post. Graphic of a self-driving vehicle.

To distinguish between different types of vehicles, lidar is needed. This technology measures distances with rapid pulses of laser light, giving shape and depth to surrounding cars and pedestrians as well as the road's geography, and it is not affected by low-light conditions. On the downside, lidar sensors cost more than cameras and radar, and have a more limited range.
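The distance measurement itself works much like radar, only with light pulses. The short sketch below turns one pulse's round-trip time and the beam's direction into a 3D point, as a lidar scanner does millions of times per second; all numbers are invented for illustration.

```python
# Lidar principle sketch: distance from the laser pulse's round-trip time,
# then a 3D point from the beam's azimuth/elevation angles.
# Values are invented for illustration.

import math

C = 299_792_458.0            # speed of light, m/s

def lidar_point(round_trip_s, azimuth_deg, elevation_deg):
    """Convert one pulse return into an (x, y, z) point in metres."""
    r = C * round_trip_s / 2                       # out-and-back distance
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return x, y, z

# A return arriving after ~200 ns from straight ahead, slightly below horizon.
print(lidar_point(200e-9, azimuth_deg=0.0, elevation_deg=-2.0))  # ~30 m out
```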

While capturing the sensor data is all very well, self-driving cannot happen until the sensor inputs are fed into a high-performance artificial intelligence (AI) computer such as the NVIDIA DRIVE AGX platform. This is the brain that makes sense of everything so that the car can make the decisions to drive intelligently and safely.
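To make the earlier point about redundancy concrete, here is a minimal, made-up fusion sketch: detections from different sensors that land close together are merged, and an object seen by more than one sensor type earns higher confidence. This is only an illustration of cross-checking sensor inputs, not how the DRIVE AGX software actually works.

```python
# Toy sensor-fusion sketch: merge detections from camera, radar and lidar
# by proximity, and trust an object more when several sensor types agree.
# Illustrative only; not the actual DRIVE AGX pipeline.

from collections import defaultdict

# Each detection: (sensor, x, y) in metres in the vehicle frame (made up).
detections = [
    ("camera", 30.1, 1.9),
    ("radar",  30.4, 2.1),
    ("lidar",  30.2, 2.0),
    ("camera", 12.0, -3.5),   # seen by only one sensor type
]

def fuse(detections, cell=2.0):
    """Group detections into coarse grid cells and count agreeing sensors."""
    cells = defaultdict(list)
    for sensor, x, y in detections:
        cells[(round(x / cell), round(y / cell))].append(sensor)
    objects = []
    for (cx, cy), sensors in cells.items():
        objects.append({
            "approx_position": (cx * cell, cy * cell),
            "sensors": sorted(set(sensors)),
            "confidence": len(set(sensors)) / 3,   # fraction of sensor types
        })
    return objects

for obj in fuse(detections):
    print(obj)
```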

In fact, a self-driving car can be safer than a human driver. A driver checks blind spots only when changing lanes and has to keep watching the road ahead, whereas the DRIVE AGX platform monitors every side of the vehicle all the time.

While NVIDIA has the autonomous driving platform, it relies on sensors from NVIDIA partners to gather the data it needs. Metawave delivers radar sensing technology; Velodyne lidar sensors detect objects with laser pulses; and Sekonix makes autonomous driving camera sensors.
