Do People Have to Walk Really Fast?

Owl Autonomous Imaging has developed and patented “monocular 3D thermal imaging and ranging solutions.”

The company’s tech is applicable to automotive advanced driver assistance systems (ADAS).

According to Owl, current ADAS offerings generally use “mutually dependent visible-light cameras and radar” that can be negatively affected under certain conditions. Like night and/or rain.

So the Owl system is meant to overcome those limitations by using HD thermal imaging and computer vision algorithms capable of creating “ultra-dense point clouds and highly refined object classification.”

In other words, objects like people and other animals generate heat, and this system is able to detect it.
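The core idea is simple enough to sketch. Here is a toy Python illustration—emphatically not Owl’s algorithm, and with all temperatures invented for the example—of why warm bodies stand out in a thermal frame:

import numpy as np

# Toy illustration (not Owl's algorithm): warm bodies stand out against a
# cooler background in a thermal frame, so even a crude temperature
# threshold can flag candidate pedestrians for downstream classification.
# All temperatures here are invented for the example.

rng = np.random.default_rng(0)
frame = rng.normal(15.0, 2.0, size=(48, 64))          # background scene, ~15 C
frame[20:35, 30:38] = rng.normal(33.0, 1.0, (15, 8))  # a person, ~33 C

warm = frame > 28.0  # pixels plausibly warmer than the ambient scene
ys, xs = np.nonzero(warm)
print("warm blob spans rows", ys.min(), "-", ys.max(),
      "and cols", xs.min(), "-", xs.max())

The real system obviously does far more than threshold pixels—that is where the “highly refined object classification” comes in—but the physics of the signal is what makes night and rain less of a problem.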

Here’s the curious thing: the tech that Owl is developing for automotive use is predicated on “a thermal ranging solution developed under a challenge grant from the US Air Force to track missiles in flight traveling at over 1,000 mph.”

Lucid Describes Robust Sensor Suite

Although the folks at Lucid Group probably don’t think about Elon all that often…

By Gary S. Vasilash

Lucid Group, which is producing its Lucid Air electric vehicles in its brand-new plant in Casa Grande, AZ, put Tesla in second place in the range department: the Air earned a 520-mile range rating from the EPA, while the Tesla Model S Long Range is rated at 412 miles.

(To be sure, 412 miles is nothing to sniff at, as it is the sort of thing that most OEMs would give up an engine plant to achieve.)

And now there is another numeric, and arguably functional, difference.

Elon Musk is famously sensor thrifty, as Tesla models depend on cameras and ultrasonic sensors (they had been using radar, but evidently that went away earlier this year). Which makes the nomenclature “Full Self-Driving” and “Autopilot” all the more troubling for those who actually think about the implications of those names.

Lucid announced the details of its “DreamDrive” advanced driver assistance systems, the base and Pro versions (Pro is standard on Lucid Air Dream Edition and Lucid Air Grand Touring, so the “dream” in the name goes to the model, not some sort of suggestion that one can sleep behind the wheel).

Lucid’s DreamDrive sensor suite provides comprehensive coverage. (Image: Lucid)

The system can utilize as many as 32 sensors, including 14 visible-light cameras, five radar units, four surround view cameras, ultrasonic sensors throughout the vehicle exterior, and, for DreamDrive Pro, solid-state lidar.

Of course, sensors are only part of the story. Processing capability is essential.

Lucid is using its proprietary “Ethernet Ring” system, a high-speed data network that allows four computer gateways to communicate at gigabit speeds, so the processors can translate sensor input into the steering, braking and accelerating functions as required.
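Lucid hasn’t published the internals of that network, but the appeal of a ring topology is easy to illustrate: every node has two paths to every other node, so a single severed link doesn’t take the network down. A toy Python model, with the four-gateway count taken from Lucid’s description and everything else assumed:

GATEWAYS = 4  # four computer gateways, per Lucid's description

def reachable(src, dst, broken_link):
    # Hypothetical model, not Lucid's implementation. Links connect node i
    # to (i + 1) % GATEWAYS. Walk the ring clockwise, then counter-clockwise;
    # succeed if either direction avoids the break.
    for step in (1, -1):
        node = src
        while node != dst:
            nxt = (node + step) % GATEWAYS
            if {node, nxt} == set(broken_link):
                break  # this direction is blocked by the severed link
            node = nxt
        else:
            return True  # reached dst without crossing the broken link
    return False

# Sever the link between gateways 1 and 2: gateway 0 still reaches everyone.
print(all(reachable(0, d, (1, 2)) for d in (1, 2, 3)))  # True

For a system that turns sensor input into steering and braking commands, that kind of redundancy is presumably the point.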

When it comes to driver assistance, the more support—and sensors—the better.

Tesla’s approach notwithstanding.

Lidar Explained

You’ve probably heard reference to “lidar.” Here’s where you can get a quick tutorial.

By Gary S. Vasilash

Elon Musk once famously said, “Lidar is a fool’s errand.”

And it went downhill from there.

What was he talking about?

A sensor that uses laser beams.

The sensor sends out pulsed light waves from as many as 128 individual lasers (at an eye-safe wavelength, so you need not worry about being blinded by a vehicle coming at you with lidar engaged). The waves hit something and bounce back. The time is calculated (send, hit, return). And the information is used to generate a 3D map of the environment. Realize that there is a lot going on here: this beam bouncing is taking place millions of times per second.
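The arithmetic behind each of those calculations is simple. A back-of-the-envelope Python sketch—illustrative, not anything from an actual lidar’s firmware:

C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(t_seconds):
    # Distance is speed of light times round-trip time, halved because
    # the pulse travels out to the target and back.
    return C * t_seconds / 2.0

# A return that arrives 667 nanoseconds after the pulse was fired:
print(round(range_from_round_trip(667e-9), 1))  # ~100.0 meters away

The hard part isn’t the math; it’s doing it millions of times per second and assembling the results into a usable point cloud.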

Using lasers for sensing. (Image: Velodyne Lidar)

The whole purpose of this is to enhance a vehicle’s ability to provide safer driving, for the people within the vehicle as well as others, be they in other vehicles or on foot. Lidar can also contribute to self-driving vehicles, with the sensor or sensors providing input so that the vehicle can perform accordingly. (Some lidar devices have a 360° view, so conceivably only one would be needed on the roof of a vehicle to “see” what’s going on; others have a more limited view, say 120°, so multiples would be installed.)
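The sensor-count arithmetic there is straightforward, if idealized. A quick sketch, ignoring mounting overlap and blind spots:

import math

def sensors_needed(horizontal_fov_deg):
    # Idealized count for full 360-degree horizontal coverage.
    return math.ceil(360.0 / horizontal_fov_deg)

print(sensors_needed(360))  # 1 -- a single spinning unit on the roof
print(sensors_needed(120))  # 3 -- three fixed units around the vehicle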

3D lidar was invented by David Hall in 2005. He had established a company in 1983 to produce audio subwoofers. What was then Velodyne Acoustics has become Velodyne Lidar.

And on this edition of “Autoline After Hours” Mircea Gradu, Velodyne senior vice president of Product and Quality, provides an explanation of lidar—the how, why, where and when of the technology.

One of the things that he really emphasizes in his comments is the importance of lidar when it comes to safety.

He points out, for example, that most vehicle-pedestrian accidents occur after dark: in 2018, 76% of pedestrian crash fatalities in the U.S. occurred at night.

Lidar can “see” in the dark. Camera-radar based systems don’t have the same level of capability. So as far as Velodyne is concerned, any advanced driver assistance system (ADAS) really needs to have lidar sensors as part of its sensing suite. Assuming that the vehicles are going to travel at night.

While Gradu is, not surprisingly, a big proponent of lidar, he also acknowledges that there needs to be sensor fusion, the use of more than just one or two types of sensors. After all, the subject is safety, and who wants to stint?

Gradu talks with Alexa St. John of Automotive News, “Autoline’s” John McElroy and me.

Then during the second half of the show the three of us discuss a number of topics, including the semiconductor shortage and potential solutions, whether companies like GM are putting billions of dollars at risk when they invest heavily in electric vehicles, and more.

And you can watch the show right here.