How Driverless Cars See the World Around Them


On Sunday, a woman in Arizona died after being struck by a self-driving car operated by Uber. Here is a guide to how autonomous vehicles operate.
On Sunday night, a woman died after she was hit by a self-driving car operated by Uber in Tempe, Ariz. The car was operating autonomously, though a safety driver was behind the wheel, according to a statement from the local police.
Uber is one of many companies now testing this kind of vehicle in Arizona, California and other parts of the country. Waymo, the self-driving car company owned by Google’s parent company, Alphabet, has said it is also operating autonomous cars on the outskirts of Phoenix without a safety driver behind the wheel. On Monday, Uber said it was halting tests in Tempe, Pittsburgh, Toronto and San Francisco.
Here is a brief guide to the way these cars operate.
When designing these vehicles, companies like Uber and Waymo begin by building a three-dimensional map of a place. They equip ordinary automobiles with lidar sensors — “light detection and ranging” devices that measure distances using pulses of light — and as company workers drive these cars on local roads, these expensive devices collect the information needed to build the map.
Once the map is complete, cars can use it to navigate the roads on their own. As they do, they continue to track their surroundings using lidar, and they compare what they see with what the map shows. In this way, the car gains a good idea of where it is in the world.
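In code, that comparison step amounts to what roboticists call scan matching: trying candidate positions and keeping the one where the live lidar scan lines up best with the stored map. The Python sketch below is a deliberately simplified two-dimensional illustration of the idea; the brute-force pose search, the nearest-neighbor error metric and every function name here are assumptions for the example, not how Uber or Waymo actually implement it.

```python
import numpy as np

def localize(map_points, scan_points, candidate_poses):
    """Toy scan matching: pick the candidate pose whose transformed
    scan lines up best with the prebuilt map (hypothetical 2-D version).

    map_points:      (M, 2) array of landmark positions from the map
    scan_points:     (N, 2) array of points from the live lidar scan,
                     expressed in the car's own frame of reference
    candidate_poses: list of (x, y, heading) guesses to evaluate
    """
    best_pose, best_error = None, float("inf")
    for x, y, theta in candidate_poses:
        # Rotate and translate the scan into map coordinates.
        c, s = np.cos(theta), np.sin(theta)
        rotation = np.array([[c, -s], [s, c]])
        world = scan_points @ rotation.T + np.array([x, y])
        # Error = average distance from each scan point to its
        # nearest map point (a crude nearest-neighbor match).
        dists = np.linalg.norm(world[:, None, :] - map_points[None, :, :], axis=2)
        error = dists.min(axis=1).mean()
        if error < best_error:
            best_pose, best_error = (x, y, theta), error
    return best_pose
```

Real systems refine the pose continuously rather than testing a fixed list of guesses, but the principle is the same: the car is wherever its scan and its map agree.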
Lidar also alerts the cars to nearby objects, including other cars, pedestrians and bicyclists.
Lidar works pretty well, but it can’t do everything. It provides information only about objects that are relatively close, which limits how fast cars can drive. Its measurements are not always sharp enough to distinguish one object from another. And when multiple autonomous vehicles drive the same road, their lidar signals can interfere with one another.
Even in situations where lidar works well, these companies want backup systems in place. So most driverless cars are also equipped with a variety of other sensors.
Those backup systems include cameras, radar and global positioning system antennas, the kind of GPS hardware that tells your smartphone where it is.
With the GPS antennas, companies like Uber and Waymo are providing cars with even more information about where they are in the world. With cameras and radar sensors, they can gather additional information about nearby pedestrians, bicyclists, cars and other objects.
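The companies do not publish how they reconcile these overlapping estimates, but a standard approach is to weight each sensor by how much it is trusted. The sketch below fuses a hypothetical GPS fix with a hypothetical lidar-based estimate using inverse-variance weighting, the same idea that underlies the Kalman filters common in robotics; the numbers are invented.

```python
import numpy as np

def fuse_estimates(estimates):
    """Fuse several noisy estimates of the same quantity.

    estimates: list of (value, variance) pairs, e.g. one position
               estimate from GPS and one from lidar map matching.
               A lower variance means a more trusted sensor.
    Returns the inverse-variance weighted mean and its variance.
    """
    values = np.array([v for v, _ in estimates], dtype=float)
    weights = np.array([1.0 / var for _, var in estimates])
    fused = (weights * values).sum() / weights.sum()
    return fused, 1.0 / weights.sum()

# Hypothetical 1-D example: GPS says the car is at 104.0 m with high
# uncertainty; lidar map matching says 103.2 m with low uncertainty.
# The fused answer lands close to the more trusted lidar estimate.
print(fuse_estimates([(104.0, 25.0), (103.2, 0.04)]))
```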
Cameras also provide a way to recognize traffic lights, street signs, road markings and other signals that cars need to take into account.
Making sense of all that data is the hard part. Sifting through it and responding to it require a system of immense complexity.
In some cases, engineers will write specific rules that define how a car should respond in a particular situation. If a Waymo car detects a red light, for example, it is programmed to stop.
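A hand-written rule of that kind reduces to plain conditional logic. The fragment below is an invented illustration of the pattern, not Waymo's code; every field name is hypothetical.

```python
from collections import namedtuple

# A stand-in for what the car's perception system reports.
Perception = namedtuple(
    "Perception", ["traffic_light", "pedestrian_in_path", "can_stop_safely"]
)

def plan_action(p):
    """Hypothetical hand-written rules mapping sensor reports to an action."""
    if p.traffic_light == "red":
        return "stop"
    if p.pedestrian_in_path:
        return "stop"
    if p.traffic_light == "yellow" and p.can_stop_safely:
        return "stop"
    return "proceed"

print(plan_action(Perception("red", False, True)))    # -> "stop"
print(plan_action(Perception("green", False, True)))  # -> "proceed"
```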
But a team of engineers could never write rules for every situation a car could encounter. So companies like Waymo and Uber are beginning to rely on “machine learning” systems that can learn behavior by analyzing vast amounts of data describing the country’s roadways.
Waymo now uses a system that learns to identify pedestrians by analyzing thousands of photos that contain people walking or running across or near roads.
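In broad strokes, such a system is trained rather than programmed: it is shown labeled examples and adjusts its internal parameters until it can tell the classes apart. The toy sketch below trains a logistic-regression classifier on made-up feature vectors standing in for camera images; production systems use deep neural networks on raw pixels, and the features, labels and model choice here are purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-ins for feature vectors extracted from labeled photos:
# label 1 = "contains a pedestrian", label 0 = "does not".
pedestrian_features = rng.normal(loc=1.0, size=(500, 16))
background_features = rng.normal(loc=-1.0, size=(500, 16))

X = np.vstack([pedestrian_features, background_features])
y = np.array([1] * 500 + [0] * 500)

# Fit the classifier to the labeled examples.
model = LogisticRegression().fit(X, y)

# Score a new, unseen example.
new_example = rng.normal(loc=1.0, size=(1, 16))
print("pedestrian probability:", model.predict_proba(new_example)[0, 1])
```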
It is unclear what happened in Tempe. But these cars are designed so that if one system fails, another will kick in. In all likelihood, the Uber cars used lidar and radar as well as cameras to detect and respond to nearby objects, including pedestrians.
Self-driving cars can have difficulty duplicating the subtle, nonverbal communication that goes on between pedestrians and drivers. An autonomous vehicle, after all, can’t make eye contact with someone at a crosswalk.
“It is still important to realize how hard these problems are,” said Ken Goldberg, a professor at the University of California, Berkeley, who specializes in robotics. “That is the thing that many don’t understand, just because these are things humans do so effortlessly.”
These cars are designed to work at night; sensors like lidar and radar operate just as well in darkness as they do in the daytime. Some companies even argue that it is easier for these cars to operate at night.
But there are conditions that these cars are still struggling to master. They do not work as well in heavy precipitation. They can have trouble in tunnels and on bridges. And they may have difficulty dealing with heavy traffic.