What Does a LiDAR System “See”?

“LiDAR” stands for “light detection and ranging.” It is a key element of advanced driver assistance systems (ADAS) as well as autonomous driving systems.

By Gary S. Vasilash

In case you ever wondered what a LiDAR system might “see,” here is a picture of a LiDAR point cloud with HD resolution and long-distance range.

AEye LiDAR image. (Image: Continental)

It is worth noting that this image, from LiDAR developer AEye, isn’t necessarily the output of all systems.

Each LiDAR system developer essentially has its own approach to how the laser beams are sent out and returned, over what distance, at what frequency, and so on.

Supplier Continental has been working with AEye for the past 10 months and has now announced that it will use the LiDAR system in its suite of hardware and software for autonomous driving systems (Level 2+ to Level 4).

In industry parlance this is known as a “stack.”

So the AEye LiDAR hardware and software will be part of the Continental stack, which will also include cameras and radar.

The rule in developing autonomous driving capability is that the more sensors the better.

Or, as Frank Petznick, head of Continental’s ADAS Business Unit, put it, “Reliable and safe automated and autonomous driving functions will not be feasible without bringing the strengths of all sensor technologies together.”

Let’s face it: If you’re in a vehicle that is driving itself, you want as many sensors and computers as possible to keep you safe when going from A to B.

Continental anticipates that it will be producing the AEye long-range LiDAR systems in production volumes by 2024.

One more interesting thing about this: while the AEye system is long range (1,000 meters), Continental has a solid-state, short-range, high-resolution 3D flash LiDAR that it will bring to market later this year.

When you hear a certain individual talking about how autonomous driving can be done with a single type of sensor and a smart processor, realize that companies like Continental aren’t creating stacks because they just like complexity.

They don’t.

They do like to assure safety.

Why Your iPhone Isn’t Like Your Car

The rumored Apple car notwithstanding, OEMs are clearly working to make their user interfaces large, icon-intensive and swipeable, just like a smartphone’s. Since Tesla rolled out the 17-inch screen in the Model S, there has been an ongoing effort to make screens big and familiar, perhaps the most exaggerated example being the 56-inch Mercedes MBUX Hyperscreen.

Whereas it was once said that a “car is a computer on wheels,” the phrase now seems to be that a “car is a smartphone on wheels.”

After all, OEMs are not only aggressively integrating apps that they can mine for data and otherwise monetize, but the ability to perform over-the-air (OTA) updates is becoming as de rigueur as standard Android Auto and Apple CarPlay.

Tamara Snow of Continental. The complexity of vehicle compute, control and communications networks shouldn’t be underestimated. (Image: Continental)

But Tamara Snow, head of Research and Advanced Engineering, North America, Continental Automotive, points out that when it comes to compute architecture and application, there are some substantial differences between the high-powered processing and communications device you carry around and the high-powered processing and transportation device that carries you around.

Snow notes that a smartphone has:

  • 1 microprocessor
  • 1 display
  • 1 operating system
  • 7 sensors
  • 6.1-ounce mass

And the smartphone has a top speed of 0 mph.

A vehicle has:

  • 100 microcontrollers
  • 4 displays
  • 4 operating systems
  • 100s of sensors
  • 2.5-ton mass

The vehicle has a top speed of 155 mph.

Snow says that when it comes to a smartphone, a software glitch can be “annoying.”

But the same for a motor vehicle can be “fatal.”

Making cars, and the systems that go into them, is hard. –gsv