GM Super Cruise: To the Moon!

By Gary S. Vasilash

As you’ve probably noticed, when you look at the Moon overhead, it generally appears to be about the size of a dime. When you look at it near the horizon, it appears massive. Which might lead you to believe that the Moon radically changes its distance from the Earth.

It doesn’t.

The Moon is, on average, 238,855 miles from the Earth whether it appears so close that you think you could drive to it or not.

Walking on the Moon. (Image: NASA)

This astronomical moment leads to General Motors’ announcement that it has expanded Super Cruise, its advanced driver assistance system that permits hands-free driving, to cover 750,000 miles of roadways in the U.S. and Canada.

GM claims that is nearly six times the coverage of any other hands-free driver assistance system available in North America.

It also says, “750,000 miles is like traveling one way from Earth to the Moon three times” [emphasis not added, it’s GM’s].

GM is adding the capability to handle the new roads via over-the-air updates to Super Cruise-equipped vehicles currently on the road, with three exceptions: the Cadillac CT6, Chevrolet Bolt EUV and Cadillac XT6.

Why are those three left out?

A GM spokesperson tells us that it is because those three vehicles have an older electrical system that’s incapable of handling the update.

Still, those vehicles can handle hands-free driving on some 400,000 miles of roads, so that’s a trip to the Moon and about 70% of the way back.
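
For anyone who wants to check the Moon math, here’s a quick sketch in Python; the only inputs are the figures cited above:

```python
# Quick check of the Earth-Moon comparisons above.
MOON_DISTANCE_MILES = 238_855   # average Earth-Moon distance

super_cruise_miles = 750_000    # expanded Super Cruise road network
older_vehicle_miles = 400_000   # coverage for the CT6, Bolt EUV and XT6

# 750,000 miles works out to roughly three one-way trips to the Moon.
print(super_cruise_miles / MOON_DISTANCE_MILES)    # ~3.14 trips

# 400,000 miles is one trip there plus most of the way back.
return_leg = older_vehicle_miles - MOON_DISTANCE_MILES
print(return_leg / MOON_DISTANCE_MILES)            # ~0.67, i.e. about 70% of the way back
```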

Not bad at all.

Self-Driving Is Desirable. So Is Cake Without Calories.

By Gary S. Vasilash

The “Consumer Attitudes Around Autonomous Vehicle Technology Survey” indicates that there is a solid base of consumers who are ready to spend money to buy self-driving capability for their next vehicle.

Perhaps.

That is, there is a blur between advanced driver assistance systems (ADAS) and autonomous driving, even though the survey, conducted for Ghost Autonomy, a developer of autonomous driving software, provides definitions of both. ADAS includes “automatic emergency braking, blind spot and pedestrian detection, lane keeping assist, surround view, parking assist, driver drowsiness detection and gaze detection,” while autonomous driving technologies are framed in terms of the SAE levels of driving automation. But the survey claims “L3-L5 is considered fully autonomous driving that does not require human backup,” which is not the case: L3, while it lets the driver do other things, also requires that the driver be capable of reassuming, well, driving.

BMW Personal Pilot L3: yes, a driver is still required to regain control when needed. (Image: BMW)

For example, BMW has launched “BMW Personal Pilot L3,” which will be available to purchasers (adding 6,000 euros to the sticker) of the BMW 7 Series—in Germany only.

According to BMW this system provides “Level 3 capability as defined by the Society of Automotive Engineers,” and it “allows drivers to redirect their focus to other in-vehicle activities when travelling at up to 60 km/h (37 mph) on motorways with structurally separated carriageways.”

However, the driver “still has to be ready to reassume the task of driving at any time – i.e. as soon as the situation on the road requires them to or the stretch of road suitable for using the BMW Personal Pilot L3 comes to an end.”

In other words, “human backup.”

According to the Ghost Autonomy survey, 52% of those who have experienced self-driving would “consider buying a car with full autonomy sooner if the technology was available today.” (Those who have experienced self-driving would arguably be people who have ridden in a Cruise or Waymo vehicle: FSD’s name notwithstanding, Tesla’s product isn’t self-driving, at least not within the existing classification, and it actually requires that the driver keep hands on the wheel.) Which is sort of a moot point because (a) the technology isn’t available today and (b) it’s not likely to be anytime soon.

What’s more, 78% of those “drivers who’ve experienced self-driving” are willing to pay $5,000 or more upfront. Arguably it will take a lot more than $5,000: the aforementioned BMW system works out to about $6,600, and the Tesla FSD package (“Your car will be able to drive itself almost anywhere with minimal driver intervention and will continuously improve”) adds $12,000 to the sticker.

One finding in the survey is certainly laudable. When respondents were asked to rank the factors they will consider when purchasing their next vehicle, the order was:

  1. ADAS
  2. Keyless or phone-based entry and start
  3. Premium infotainment screen and sound
  4. Premium interior/exterior trim
  5. EV/battery powertrain

Yes, safety systems rank first.

But one wonders whether that answer isn’t analogous to what people say at the dentist’s office when asked about their brushing and flossing habits.

Who is going to say even on a survey that awesome audio is more important to them than safety?

Do People Have to Walk Really Fast?

Owl Autonomous Imaging has developed and patented “monocular 3D thermal imaging and ranging solutions.”

The company’s tech is applicable to automotive advanced driver assistance systems (ADAS).

According to Owl, current ADAS systems generally use “mutually dependent visible-light cameras and radar” that can be negatively affected under certain conditions. Like night and/or rain.

So the Owl system is meant to overcome those limitations by using HD thermal imaging and the appropriate computer vision algorithms that are capable of creating “ultra-dense point clouds and highly refined object classification.”

In other words, people and other animals generate heat, and this system is capable of detecting it.

Here’s the curious thing: the tech that Owl is developing for automotive use is predicated on “a thermal ranging solution developed under a challenge grant from the US Air Force to track missiles in flight traveling at over 1,000 mph.”

Lucid Describes Robust Sensor Suite

Although the folks at Lucid Group probably don’t think about Elon all that often. . .

By Gary S. Vasilash

Lucid Group, which is producing its Lucid Air electric vehicles in its brand-new plant in Casa Grande, AZ, put Tesla in second place in the range department: the Air earned a 520-mile range rating from the EPA, while the Tesla Model S Long Range is rated at 412 miles.

(To be sure, 412 miles is nothing to sniff at, as it is the sort of thing that most OEMs would give up an engine plant to achieve.)

And now there is another numeric–and arguably functional–difference.

Elon Musk is famously sensor thrifty, as Tesla models depend on cameras and ultrasonic sensors (it had been using radar, but evidently that went away earlier this year). Which makes the nomenclature “Full Self-Driving” and “Autopilot” all the more troubling for those who actually think about the implications of those names.

Lucid announced the details of its “DreamDrive” advanced driver assistance system, which comes in base and Pro versions (Pro is standard on the Lucid Air Dream Edition and Lucid Air Grand Touring, so the “dream” in the name goes to the model, not some sort of suggestion that one can sleep behind the wheel).

The Lucid DreamDrive sensor suite provides comprehensive coverage. (Image: Lucid)

The system can utilize as many as 32 sensors, including 14 visible-light cameras, five radar units, four surround view cameras, ultrasonic sensors throughout the vehicle exterior, and, for DreamDrive Pro, solid-state lidar.

Of course, sensors are only part of an ADAS. Processing capability is essential.

Lucid is using its proprietary “Ethernet Ring,” a high-speed data network that lets four computer gateways communicate at gigabit speeds, so the processors can ensure that sensor input gets translated into the steering, braking and accelerating functions as required.
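
The announcement doesn’t go into the internals of that network, but the general idea of compute gateways wired into a ring and passing sensor data around can be sketched in a few lines. To be clear, the gateway names and frame format below are hypothetical illustrations of a ring topology, not Lucid’s actual implementation:

```python
# Illustrative ring of compute gateways forwarding sensor frames.
# Node names and the frame format are made up for explanation only.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    source: str     # e.g. "front_radar", "roof_lidar"
    payload: bytes  # raw sensor data

class Gateway:
    def __init__(self, name: str):
        self.name = name
        self.next_hop = None  # the next gateway in the ring

    def forward(self, frame: SensorFrame, hops_left: int) -> None:
        # Each node sees the frame and hands it to its neighbor.
        print(f"{self.name} received frame from {frame.source}")
        if hops_left > 0 and self.next_hop is not None:
            self.next_hop.forward(frame, hops_left - 1)

# Four gateways wired into a ring, so a frame injected at any node
# can reach every other node by traveling around the loop.
nodes = [Gateway(n) for n in ("gw_front", "gw_left", "gw_rear", "gw_right")]
for i, node in enumerate(nodes):
    node.next_hop = nodes[(i + 1) % len(nodes)]

nodes[0].forward(SensorFrame(source="front_radar", payload=b"\x00\x01"), hops_left=3)
```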

When it comes to driver assistance, the more support—and sensors—the better.

Tesla’s approach notwithstanding.

Lidar Explained

You’ve probably heard reference to “lidar.” Here’s where you can get a quick tutorial.

By Gary S. Vasilash

Elon Musk once famously said, “Lidar is a fool’s errand.”

And it went downhill from there.

What was he talking about?

A sensor that uses laser beams.

The sensor sends out pulsed light waves from as many as 128 individual lasers (at an eye-safe wavelength, so you need not worry about being blinded by a vehicle coming at you with lidar engaged). The waves hit something and bounce back. The round-trip time is measured (send, hit, return), and because light travels at a known speed, that time translates directly into distance. The information is used to generate a 3D map of the environment. Realize that there is a lot going on here: this beam bouncing is taking place at a rate of millions of times per second.
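
The distance math itself is straightforward time-of-flight: multiply the round-trip time by the speed of light and divide by two. A minimal sketch (the 200-nanosecond echo is a made-up example value):

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """One-way distance in meters for a lidar pulse that echoed back."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse that returns after 200 nanoseconds came from about 30 meters away.
print(f"{distance_from_echo(200e-9):.1f} m")  # 30.0 m
```

Repeat that millions of times per second across up to 128 lasers and the individual distances add up to the 3D point cloud that becomes the map of the environment.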

Using lasers for sensing. (Image: Velodyne Lidar)

The whole purpose of this is to enhance a vehicle’s ability to provide safer driving—for the people within the vehicle as well as others, be they in other vehicles or on foot. And it can also contribute to self-driving vehicles, with the sensor or sensors providing input so that the vehicle can perform accordingly (some lidar devices have a 360° view, so conceivably only one would be needed on the roof of a vehicle to “see” what’s going on; others have a more limited view, say 120°, so multiples would be installed).

3D lidar was invented by David Hall in 2005. He had established a company in 1983 to produce audio subwoofers. What was then Velodyne Acoustics has become Velodyne Lidar.

And on this edition of “Autoline After Hours” Mircea Gradu, Velodyne senior vice president of Product and Quality, provides an explanation of lidar—the how, why, where and when of the technology.

One of the things that he really emphasizes in his comments is the importance of lidar when it comes to safety.

He points out, for example, that most vehicle-pedestrian accidents occur after dark: in 2018, 76% of pedestrian crash fatalities in the U.S. occurred at night.

Lidar can “see” in the dark. Camera-radar based systems don’t have the same level of capability. So far as Velodyne is concerned, any advanced driver assistance system (ADAS) really needs to have lidar sensors as part of its sensing suite. Assuming that the vehicles are going to travel at night.

While Gradu is, not surprisingly, a big proponent of lidar, he also acknowledges that there needs to be sensor fusion–the use of more than just one or two types of sensors. After all, the subject is safety, and who wants to stint?

Gradu talks with Alexa St. John of Automotive News, “Autoline’s” John McElroy and me.

Then during the second half of the show the three of us discuss a number of topics, including the semiconductor shortage and potential solutions, whether companies like GM are putting billions of dollars at risk when they invest heavily in electric vehicles and more.

And you can watch the show right here.