Why lidar is essential for self-driving cars

An Uber driverless Ford Fusion drives in Pittsburgh, Pennsylvania, with a spinning lidar on top.

Velodyne has dominated the market for self-driving car lidar ever since its spinning sensors proved themselves in the DARPA Grand Challenge era. Google hired DARPA Grand Challenge veterans to run its self-driving car program, and they put Velodyne units on the early Google cars. Other companies working on self-driving cars bought Velodyne gear, too.

Today, most self-driving cars rely on a trio of sensor types: cameras, radar, and lidar. Each has its own strengths and weaknesses. Cameras capture high-resolution color images, but they can’t measure distances with any precision, and they’re even worse at estimating the velocity of distant objects.

Radar can measure both distance and velocity, and automotive radars have gotten a lot more affordable in recent years. “Radar is good when you’re close to the vehicle,” says Craig Glennie, a lidar expert at the University of Houston. “But because radar uses radio waves, they’re not good at mapping fine details at large distances.”

Lidar offers the best of both worlds. Like radar, lidar scanners can measure distances with high accuracy. Some lidar sensors can even measure velocity. Lidar also offers higher resolution than radar. That makes lidar better at detecting smaller objects and at figuring out whether an object on the side of the road is a pedestrian, a motorcycle, or a stray pile of garbage.

And unlike cameras, lidar works about equally well in any lighting condition.

The big downside of lidar is that it’s expensive. Velodyne’s original 64-laser lidar cost a whopping $75,000. More recently, Velodyne has begun offering smaller and cheaper models with 32 and 16 lasers. Velodyne is advertising a $7,999 price for a 16-laser model introduced in 2014. Velodyne recently announced a new 128-laser unit, though it has been tight-lipped about pricing.

Last year, Velodyne announced an order from Ford for a new solid-state lidar design. Velodyne said that it had “set target pricing of less than $500 per unit in automotive mass production quantities.” But it didn’t say how many units Ford was buying, how much Ford was actually paying, or how soon Velodyne expected to reach mass-market scales and price points.

Velodyne can’t afford to rest on its laurels. The company is about to face a lot of competition from rivals building lidar systems with very different designs.

The rise of solid-state lidar

Carmakers expect automotive components to last for hundreds of thousands of miles over bumpy roads and a wide range of temperatures. It’s more challenging to make a system cheap and reliable if it has moving parts, as Velodyne’s spinning lidar units do.

So a lot of experts believe the key to making lidar a mainstream technology is to shift toward solid-state designs with no moving parts. That requires some mechanism for directing laser light in different directions without mechanically moving the laser unit itself.

Researchers have developed three major approaches to doing this:

Microelectromechanical systems (MEMS) use a tiny mirror—millimeters across—to steer a fixed laser beam in different directions. Such a tiny mirror has a low moment of inertia, allowing it to move very quickly—quickly enough to trace out a two-dimensional scanning pattern in a fraction of a second.
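
To make the scanning idea concrete, here is a minimal sketch of how driving a mirror's two axes at different frequencies traces out a dense pattern. The frequencies, deflection angle, and pulse rate are illustrative assumptions, not specs from any real sensor:

```python
import numpy as np

# Minimal sketch of a MEMS-style two-axis scan. The frequencies and the
# deflection angle are illustrative assumptions, not real sensor specs.
F_FAST = 2000.0   # fast-axis mirror frequency, Hz (assumed)
F_SLOW = 60.0     # slow-axis frequency, Hz (assumed)
MAX_DEG = 15.0    # peak mirror deflection, degrees (assumed)

def mirror_angles(t):
    """Horizontal and vertical mirror deflection at time t.

    Driving the two axes at different frequencies traces a dense
    Lissajous-style pattern over the scene in a fraction of a second.
    """
    return (MAX_DEG * np.sin(2 * np.pi * F_FAST * t),
            MAX_DEG * np.sin(2 * np.pi * F_SLOW * t))

# One slow-axis period, firing a laser pulse every 10 microseconds.
t = np.arange(0.0, 1.0 / F_SLOW, 1e-5)
h, v = mirror_angles(t)
print(f"{t.size} pulses cover a ±{MAX_DEG}° field in {1000.0 / F_SLOW:.1f} ms")
```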

Two leading startups working on MEMS lidar sensors are Luminar and Innoviz, according to Sam Abuelsamid, an analyst at Navigant Research. Chipmaker Infineon recently acquired Innoluce, a startup with MEMS expertise.

Abuelsamid told Ars that one advantage of the MEMS approach is that a lidar sensor can dynamically adjust its scan pattern to focus on objects of particular interest. By directing more fine-grained laser pulses toward a small or distant object, the sensor can identify it more reliably, something a conventional mechanical laser scanner can't do.

Phased arrays use a row of emitters that can change the direction of a laser beam by adjusting the relative phase of the signal from one emitter to the next. If the emitters all emit light in sync, the resulting beam points straight ahead. But if the emitters on the left-hand side lag slightly behind those on the right, the combined wavefront tilts and the beam points toward the left, and vice versa.
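
The steering math is simple enough to sketch. The snippet below computes the beam angle produced by a given phase offset between adjacent emitters, using the standard phased-array relation; the wavelength and emitter spacing are illustrative assumptions:

```python
import math

# One-dimensional phased-array beam steering. The wavelength and
# emitter spacing below are illustrative assumptions.
WAVELENGTH = 905e-9        # meters (905 nm, a common lidar wavelength)
SPACING = WAVELENGTH / 2   # emitter-to-emitter spacing (assumed half-wave)

def beam_angle_deg(phase_step_rad):
    """Beam direction for a given phase offset between adjacent emitters.

    Zero offset: the wavefronts line up and the beam points straight
    ahead. A nonzero offset tilts the combined wavefront by
    theta = arcsin(delta_phi * wavelength / (2 * pi * spacing)).
    """
    return math.degrees(
        math.asin(phase_step_rad * WAVELENGTH / (2 * math.pi * SPACING)))

for step in (0.0, 0.5, 1.0):   # phase offset in radians
    print(f"phase step {step:.1f} rad -> beam at {beam_angle_deg(step):+5.1f} deg")
```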

Phased arrays allow non-mechanical beam steering in one dimension. To steer the beam in the second dimension, these systems typically use a grating array that works like a prism, changing the direction of light based on its frequency.

At this point, phased-array lidars are mostly still in laboratories. “I’d say phased array is a cool thing for the future,” says Alex Lidow, the CEO of Efficient Power Conversion, which makes chips that are incorporated into a number of lidar products. “Today it’s spinning disk or MEMS, and spinning disk is by far the dominant.”

Quanergy is one startup reportedly working on phased-array lidar. And a key technical advisor to Strobe, the lidar startup GM acquired in October, has done research focused on phased-array lidar systems, suggesting that Strobe may be working on the technology as well.

Flash lidar dispenses with the scanning approach altogether and operates more like a camera. A laser beam is diffused so it illuminates an entire scene in a single flash. Then a grid of tiny sensors captures the light as it bounces back from various directions.
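
Whatever the steering scheme, every pixel in that grid performs the same underlying ranging calculation: distance is the speed of light times the round-trip travel time, divided by two. A minimal sketch, with the round-trip time as an illustrative input:

```python
C = 299_792_458.0  # speed of light, m/s

def distance_m(round_trip_seconds):
    """One-way distance from a pulse's round-trip travel time.

    The light travels out to the target and back, so divide by two.
    """
    return C * round_trip_seconds / 2

# A return that arrives 2 microseconds after the flash left the sensor:
print(f"{distance_m(2e-6):.0f} m")  # ~300 m
```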

One big advantage of this approach is that it captures the entire scene in a single instant, avoiding the complexities that occur when an object—or the lidar unit itself—moves while a scan is in progress. But it also has some significant disadvantages.

“The larger the pixel, the more signal you have,” Sanjiv Singh, a robotics expert at Carnegie Mellon, told Ars. Shrinking photodetectors down enough to squeeze thousands of them into a single array will produce a noisier sensor. “You get this precipitous drop in accuracy.”
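
A back-of-the-envelope sketch of Singh's point, assuming shot-noise-limited detection (where the signal-to-noise ratio scales with the square root of the collected signal) and an illustrative photon density:

```python
import math

# Why shrinking flash-lidar pixels hurts: collected signal scales with
# pixel area, and shot-noise-limited SNR scales with the square root of
# the signal. The photon density is an illustrative assumption.
PHOTONS_PER_MM2 = 1.0e6   # returned photons per mm^2 of detector (assumed)

def shot_noise_snr(pixel_width_mm):
    signal = PHOTONS_PER_MM2 * pixel_width_mm ** 2   # photons per pixel
    return math.sqrt(signal)   # SNR = N / sqrt(N) for Poisson-distributed counts

for width_mm in (1.0, 0.1, 0.01):
    print(f"{width_mm:>5} mm pixel -> SNR ~ {shot_noise_snr(width_mm):.1f}")
```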

Range is a key limitation for automotive lidar

What this means in practice is that flash lidar isn’t well suited for long-range detection. And that’s significant because experts believe that fully self-driving cars will need lidar capable of detecting objects 200 to 300 meters away.

Jim McBride, a Ford executive who organized a team for the 2005 DARPA Grand Challenge, explained why in a September interview with Ars.

McBride said to imagine a self-driving car that wants to merge into traffic moving at highway speeds. “Traffic is moving probably 60 miles per hour, roughly 30 meters a second,” McBride told Ars. “Most cars will take 6 to 10 seconds to get up to highway speed.” So a car will want to be able to see cars that are 6 to 10 seconds—or 180 to 300 meters—away.
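
The arithmetic behind that estimate, using McBride's round numbers:

```python
# McBride's merging arithmetic: required detection range is closing
# speed multiplied by the time it takes to reach highway speed.
HIGHWAY_SPEED_MPS = 30.0   # roughly 60 mph, per McBride

for merge_seconds in (6, 10):
    range_m = HIGHWAY_SPEED_MPS * merge_seconds
    print(f"{merge_seconds} s to merge -> must see cars {range_m:.0f} m away")
```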

Bouncing a laser off an object 300 meters away and then detecting the reflection isn’t easy for any lidar system. Lidar makers are exploring a few different approaches for extending range.

Most lasers on lidar sensors today operate in the near-infrared range; 905 nanometers is a popular wavelength. Because that’s close to the wavelength of visible light (visible red extends to around 780 nanometers), the eye’s lens focuses it onto the retina just as it focuses visible light, so too much laser power can fry the retina’s sensitive light detectors. For this reason, the power level of 905nm lasers is strictly regulated.

So one alternative approach is to use another wavelength that doesn’t create a risk of eye damage. Luminar, for example, is developing a lidar product that uses 1,550nm lasers. Because this is far outside the visible light range, it’s much safer for people’s eyes. As IEEE Spectrum explains it, “the interior of the eye—the lens, the cornea, and the watery fluid inside the eyeball—becomes less transparent at longer wavelengths.” The energy from a 1,550nm laser can’t reach the retina, so the eye safety concerns are much less serious.

This allows 1,550nm lidars to use much higher power levels—IEEE Spectrum says 40 times as much—which naturally makes it easier to detect laser pulses when they bounce off distant objects. The downside here is that 1,550nm lasers and detectors aren’t cheap because they require more exotic materials to manufacture.
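
As a rough sketch of what that power headroom buys: if the returned signal falls off with the square of distance, which is roughly true for a diffuse target that fills the beam, then range scales with the square root of laser power:

```python
import math

# Rough range benefit of the 40x power headroom cited for 1,550nm
# lasers. Assumes the returned signal falls off as 1/R^2 (a diffuse
# target filling the beam); real systems vary.
POWER_RATIO = 40.0

print(f"40x power -> roughly {math.sqrt(POWER_RATIO):.1f}x the range")  # ~6.3x
```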

Another way to improve the range of lidar units is to increase the sensitivity of the detectors. Argo AI, Ford’s self-driving car unit, recently acquired Princeton Lightwave, a lidar company that uses highly sensitive detectors known as single-photon avalanche diodes. As the name suggests, these detectors are sensitive enough to be triggered by a single photon of light at the appropriate frequency.
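
A small sketch of why single-photon sensitivity matters for range, treating photon arrivals as Poisson-distributed; the detection efficiency is an illustrative assumption:

```python
import math

# Poisson sketch of single-photon detection. The detection efficiency
# (the fraction of arriving photons that actually trigger the diode)
# is an illustrative assumption.
EFFICIENCY = 0.2

def p_detect(mean_photons):
    """P(at least one triggered avalanche) = 1 - exp(-efficiency * n)."""
    return 1.0 - math.exp(-EFFICIENCY * mean_photons)

for n in (1, 5, 20):
    print(f"{n:>2} photons in the return -> {p_detect(n):.0%} chance of detection")
```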

These highly sensitive detectors have been used in military and surveying applications for a while. Princeton Lightwave announced last year that it was working on bringing the technology to the automotive market.
