Lincoln Approaches 100

Yet it is working to maintain freshness and relevance by paying close attention to the market

By Gary S. Vasilash

One of the aspects of vehicle ownership that probably doesn’t receive as much attention as it ought to is the act of ownership itself.

As in purchasing the vehicle. Then the on-going owning of the vehicle.

To be sure, the product itself has to be worth acquiring. Features, functions, capabilities and the like. Style and technology.

Lincoln, which is now predicated on a lineup of SUVs, is going to be launching an electric vehicle next year and will be offering a fully electrified lineup by 2030.

Lincoln is readying the launch of the Lincoln Intelligence System, a cloud-based system that provides extensive capabilities for its vehicles, including over-the-air updates.

Lincoln will soon be launching its Lincoln ActiveGlide hands-free driving system.

And there is more.

But one of the more interesting aspects of what Lincoln has been steadily doing is providing excellent customer service (it calls its purchasers “clients”). According to a recent J.D. Power survey, Lincoln is number one in sales satisfaction among luxury brands.

2022 Lincoln Navigator (Image: Lincoln)

It has developed what it calls the “Lincoln Way,” a customer-centric approach to the buying and ownership experience that it is launching first in China, an important market for the brand, and will then roll out in North America.

Michael Sprague is Lincoln’s North America Director, which means he is in charge of marketing, sales and service for the marque in the U.S., Canada and Mexico.

And on this edition of “Autoline After Hours” Sprague talks to “Autoline’s” John McElroy and me about what Lincoln is doing to help increase the momentum that it is building with not only vehicles like the Navigator, but with its approach to the customer both during and after the sale.

Sprague is one of the most thoughtful and articulate people in the industry, so his observations about the brand—which next year will mark 100 years as part of Ford (Lincoln was founded by Henry M. Leland in 1917, and he sold it to Ford in 1922; it is worth noting that Leland had earlier founded another company: Cadillac)—are worthwhile for those with an interest in the industry.

In addition to which, McElroy and I talk with Patrick Lindemann, president, Transmission Systems, E-Mobility, Schaeffler, and John Waraniak, CEO, Have Blue, about the Indy Autonomous Challenge, which will be run at the Indianapolis Motor Speedway on October 23.

The race will pit 10 vehicles, all Dallara AV-21s engineered by student teams from around the world, against one another, with $1 million going to the winning team.

No, it will not go to the winning driver because, as the name of the race indicates, there are no drivers: this is an autonomous event.

And you can see all of this right here.

How Autonomy Will Really Start

Why Ford, Argo AI and Walmart are going to be making a difference in the implementation of the tech

By Gary S. Vasilash

Although many people think, or imagine, that autonomous driving is going to come from a company like Tesla, allowing people to do whatever they like while their vehicle chauffeurs them to wherever they are going, in point of fact that is not going to be the case, for a variety of reasons. Not the least of them is that sensors and processors are expensive, and even though some people are willing to pay an exorbitant amount of money for something that claims to be “full” but is really more than half empty, OEMs are going to need assurance that many more than a few buyers are willing to pay for the tech.

But while consumers might not opt to spend the money, commercial carriers are likely to if they can determine that the tech is going to provide them with an economic advantage.

Ford, Argo AI and Walmart are driving autonomous tech forward. (Image: Ford)

Which makes the announcement by Ford, Argo AI and Walmart about the retailer using vehicles from Ford (Escape Hybrids) and self-driving technology from Argo AI to launch an autonomous delivery service for the “last-mile” in Miami, Austin, and Washington, DC, all the more significant.

These are mass-manufactured vehicles that are going to be put to work by the world’s largest retailer in urban settings, doing driving that will conceivably provide an ROI to Walmart, if not immediately, then at some point in the future.

Tom Ward, senior vice president of last mile delivery at Walmart U.S., said, “This collaboration will further our mission to get product to the homes of our customers with unparalleled speed and ease, and in turn, will continue to pave the way for autonomous delivery.”

The way this will work is that the Walmart online ordering platform will send information to the Argo AI cloud-based infrastructure, which will then calculate the necessary scheduling and routing.
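A minimal sketch of that hand-off, just to make the flow concrete (every name, field and value here is hypothetical; this is not Walmart’s or Argo AI’s actual API):

```python
# Hypothetical sketch of the order -> cloud -> schedule/route hand-off.
# None of these names reflect Walmart's or Argo AI's real systems.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DeliveryOrder:
    order_id: str
    store_id: str
    customer_address: str
    ready_at: datetime

@dataclass
class DispatchPlan:
    order_id: str
    vehicle_id: str        # one of the self-driving Escape Hybrids
    pickup_time: datetime
    route: list[str]       # ordered list of waypoints

def plan_delivery(order: DeliveryOrder) -> DispatchPlan:
    """Stand-in for the cloud service that does the scheduling and routing."""
    return DispatchPlan(
        order_id=order.order_id,
        vehicle_id="av-042",
        pickup_time=order.ready_at + timedelta(minutes=10),
        route=[order.store_id, order.customer_address],
    )

plan = plan_delivery(DeliveryOrder("o-123", "store-miami-07",
                                   "123 Example St, Miami", datetime.now()))
print(plan.vehicle_id, plan.pickup_time, plan.route)
```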

The point is that this is all predicated on business processes.

And that’s what is going to make actual autonomy a real thing long before something shows up in your driveway that will take you from somewhere to somewhere else while you sit in the back seat eating a hot dog and watching Netflix.

Know that this is something of a journey as Ford and Argo AI have been testing their tech on city streets since 2018, the same year that Ford and Walmart ran a test in Miami. It takes time, effort and consistency of purpose.

The technology needs to be developed, tested, validated and verified.

It is not the consequence of an over-the-air update that follows a tweet.

Aurora and Volvo

Commercial trucks on the way to autonomy

By Gary S. Vasilash

If you read the sentence “Sally drove her HEMI-powered Charger down Woodward,” you probably know that the vehicle, the Charger, has a massive engine under its hood, a HEMI. (The “Woodward” part is just for color: Woodward, a.k.a. M-1, is a street in greater Detroit upon which countless HEMIs have traveled one-quarter mile at a time.)

When you read the line “the Aurora-powered Volvo VNL” and know that the VNL is Volvo Trucks’ big rig, a Class 8 truck, you might wonder what kind of powertrain an “Aurora” is.

Volvo VNL being equipped with the Aurora Driver. (Image: Aurora)

But of course it isn’t.

Rather, Aurora, as in the self-driving technology development company, has integrated its “Aurora Driver,” its hardware and software suite for Level 4 autonomous driving, into the Volvo VNL.

The truck is actually being powered—in the Sally sense—by the Volvo D13 Turbo Compound engine that can produce up to 455 hp.

As for the Aurora implementation, the company is working to assure that the Aurora hardware and software are fully integrated into the architecture of the VNL so that Volvo will be able to produce the L4-capable trucks in its plant in Dublin, Virginia.

Driving Done Remotely

Imagine being driven in an autonomous vehicle that’s being controlled by someone who is remote

By Gary S. Vasilash

Most companies that are developing autonomous driving technology for vehicles—companies like Waymo and Argo AI and Cruise—are doing so such that the autonomous vehicle is. . .autonomous.

The sensors and the processors and the actuators necessary to make a given vehicle drive without human input are all embedded in said vehicle.

Teleoperation in Berlin. (Image: Vay)

Sure, the vehicle may access the cloud every now and then for an update of some sort (e.g., perhaps for some information regarding location), but otherwise autonomous is as autonomous does.

But then there’s a company out of Berlin named Vay.

Vay’s approach to autonomy is different.

Vay has developed a “teledriving” system.

This means that there is a “teledriver.” Someone who is not in the vehicle but who is in control of the vehicle.

Think of it, perhaps, like an air traffic controller combined with someone who is playing some version of Forza.
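To make the idea concrete, here is a rough sketch of what a teledriving control loop could look like. It is a generic illustration under my own assumptions (the function names, control rate and latency threshold are invented), not Vay’s actual system:

```python
# Conceptual teledriving loop: video streams out of the car, while
# steering/throttle/brake commands stream back in over the network.
# Generic illustration only, not Vay's implementation.
import time
from dataclasses import dataclass

@dataclass
class DriveCommand:
    steering_deg: float
    throttle: float   # 0.0 .. 1.0
    brake: float      # 0.0 .. 1.0

def vehicle_loop(get_remote_command, apply_to_vehicle, max_age_s=0.2):
    """Apply the teledriver's latest command; stop safely if the link degrades."""
    while True:
        cmd, age_s = get_remote_command()   # latest command plus its age
        if cmd is None or age_s > max_age_s:
            # Link lost or command too stale: the vehicle must stop itself.
            apply_to_vehicle(DriveCommand(steering_deg=0.0, throttle=0.0, brake=1.0))
        else:
            apply_to_vehicle(cmd)
        time.sleep(0.02)  # ~50 Hz control loop
```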

Vay co-founder and CEO Thomas von der Ohe: “As our system does not rely on expensive 360-degree lidar sensors, and is therefore comparatively inexpensive, our way of rolling out driverless vehicles will not only enable consumers to experience driverless mobility sooner, but also provide a highly scalable solution that can be integrated into every car.”

It seems that the plan is to learn from teleoperation so that the company will be able to roll out autonomous features gradually.

Vay has vehicles operating in Berlin right now, but there are safety drivers on board. The company believes it will be able to operate fully teledriven vehicles next year.

What Does a LiDAR System “See”?

“LiDAR” is “light detection and ranging.” It is a key element of advanced driver assistance systems (ADAS) as well as autonomous driving systems

By Gary S. Vasilash

In case you ever wondered about what a LiDAR system might “see,” here is a picture of a LiDAR point cloud that includes HD resolution and long-distance range.

AEye LiDAR image. (Image: Continental)

It is worth noting that this image, from LiDAR developer AEye, isn’t necessarily the output of all systems.

Each LiDAR system developer essentially has its own approach to how the laser beams are sent out and returned and for how far and how often etc., etc., etc.

Supplier Continental has been working with AEye for the past 10 months and has now announced that it will be using the LiDAR system in its suite of hardware and software for autonomous driving systems (Level 2+ to Level 4).

In industry parlance this is known as a “stack.”

So the AEye LiDAR hardware and software will be part of the Continental stack, which will also include cameras and radar.

The rule in developing autonomous driving capability is that the more sensors the better.

Or, as Frank Petznick, head of Continental’s ADAS Business Unit, put it, “Reliable and safe automated and autonomous driving functions will not be feasible without bringing the strengths of all sensor technologies together.”

Let’s face it: If you’re in a vehicle that is driving itself, you want as many sensors and computers as possible to keep you safe when going from A to B.
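As a toy illustration of why that redundancy helps (my example, not Continental’s actual fusion logic): each sensor type fails under different conditions, so requiring that at least two of the three sensor types agree before an object is confirmed filters out a lot of single-sensor error.

```python
# Toy sensor-fusion rule: confirm an object only if at least two of the
# three sensor types report it. Illustrative only -- real stacks fuse
# tracks probabilistically rather than with a simple vote.
def confirmed_objects(camera_ids: set[str],
                      radar_ids: set[str],
                      lidar_ids: set[str]) -> set[str]:
    all_ids = camera_ids | radar_ids | lidar_ids
    return {
        obj for obj in all_ids
        if sum(obj in sensor for sensor in (camera_ids, radar_ids, lidar_ids)) >= 2
    }

# A camera glare artifact ("ghost") is dropped; the pedestrian and the
# truck, each seen by two sensors, are kept.
print(confirmed_objects({"pedestrian", "ghost"},
                        {"pedestrian", "truck"},
                        {"truck"}))
# prints {'pedestrian', 'truck'} (set order may vary)
```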

Continental anticipates that it will be producing the AEye long-range LiDAR systems in production volumes by 2024.

One more interesting thing about this: while the AEye system is long range (1,000 meters), Continental has a solid-state, short-range, high-resolution 3D flash LiDAR that it will be bringing to the market later this year.

When you hear a certain individual talking about how autonomous driving can be done with a single type of sensor and a smart processor, realize that companies like Continental aren’t creating stacks because they just like complexity.

They don’t.

They do like to assure safety.

Hyundai Launching RoboShuttle Pilot Program

Level 4 capability is rolling in the streets of Sejong Smart City. . .

By Gary S. Vasilash

Hyundai Motor Company—the parent organization of the firm that brings us Elantras and Tucsons and so on—announced it is launching a “RoboShuttle,” a Hyundai H350 four-door light commercial van loaded with autonomous driving and artificial intelligence technology such that it will be able to transport people “with minimal intervention from a safety driver.”

Almost-autonomous vehicle. (Image: Hyundai)

More simply: this is going to be the basis of a ride-hailing service (there is the Shucle app developed by AIRS Company, which happens to be an AI research lab owned and operated by Hyundai). The system will determine optimal routes based on demand.

The RoboShuttle will operate along a 6.1-km route—in Sejong Smart City, South Korea.

Think about that: “Sejong Smart City.”

It sounds exactly like the sort of place where there should be autonomous shuttles hailed by the Shucle app.

In the U.S. it is difficult to get potholes filled in many cities, yet the South Koreans have established a city that is described as being smart.

Automating Big Rigs

There’s a lot of weight being hauled by one of those things. And a whole lot of processing for autonomy

Here’s something to think about the next time you’re rolling down the highway in your compact crossover:

One of those big rigs that is on the road with you can weigh 80,000 pounds with a full trailer.

It doesn’t take a physicist to calculate that, consequently, stopping and maneuvering are going to require more time and distance than they do for the vehicle you’re in.
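For a rough sense of scale, here is a back-of-the-envelope comparison (my numbers: an assumed 4,000-pound crossover alongside an 80,000-pound rig, both at 65 mph). Kinetic energy scales linearly with mass, so the rig’s brakes have roughly 20 times the energy to dissipate.

```python
# Back-of-the-envelope kinetic energy comparison (assumed figures:
# a 4,000 lb crossover vs. an 80,000 lb fully loaded rig, both at 65 mph).
LB_TO_KG = 0.4536
MPH_TO_MS = 0.44704

def kinetic_energy_joules(weight_lb: float, speed_mph: float) -> float:
    """KE = 1/2 * m * v^2, with mass in kg and speed in m/s."""
    mass_kg = weight_lb * LB_TO_KG
    speed_ms = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * speed_ms ** 2

crossover = kinetic_energy_joules(4_000, 65)   # ~0.77 MJ
big_rig = kinetic_energy_joules(80_000, 65)    # ~15.3 MJ

print(f"Crossover: {crossover/1e6:.2f} MJ, rig: {big_rig/1e6:.2f} MJ")
print(f"The rig carries about {big_rig/crossover:.0f}x the kinetic energy")
```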

Plus system (look at the top of the cab) uses lidar, radar and cameras. (Image: Plus)

As drivers of those trucks tend to be on long-distance routes, developing autonomous driving capability for them is a growing area of interest.

One such company in this space is Plus, which is developing self-driving truck tech. According to Hao Zheng, CTO and co-founder of the company, it has more than 10,000 pre-orders for its system.

Here is a number from him that is even more astonishing than the aforementioned 80,000 and 10,000—even more than 80,000 times 10,000: “Enormous computing power is needed to process the trillions of operations that our autonomous driving system runs every fraction of a second.”

Trillions of operations every fraction of a second?

Plus has opted to develop its system using the NVIDIA Orin, which, according to NVIDIA, can deliver 254 trillion operations per second.
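To square that 254-TOPS figure with “trillions of operations every fraction of a second,” a quick bit of arithmetic (mine, not Plus’s or NVIDIA’s) helps:

```python
# Sanity check of "trillions of operations every fraction of a second"
# against NVIDIA's stated 254 TOPS for Orin.
ORIN_OPS_PER_SECOND = 254e12   # 254 trillion operations per second

for window_ms in (10, 50, 100):
    ops = ORIN_OPS_PER_SECOND * (window_ms / 1000)
    print(f"In {window_ms} ms: {ops/1e12:.1f} trillion operations")
# In 10 ms:  2.5 trillion operations
# In 50 ms:  12.7 trillion operations
# In 100 ms: 25.4 trillion operations
```

Even a 10-millisecond slice of a second is worth a couple of trillion operations at that rate.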

Evidently enough.

Still, driver or no, you’ve got to show those vehicles some respect.–gsv

Baidu, Geely and a Sensible Approach

Baidu is somewhat like Google, inasmuch as it operates a search engine, by far the leading search engine in China. But there are other services as well, including maps (think Google Maps), an encyclopedia (think Wikipedia) and cloud storage (think AWS).

So it is fair to simply describe it as a significant tech company.

Like other tech companies, it is expanding its operations. And so it should come as no surprise that it is moving into automotive.

But it isn’t like the company just discovered the space. It has been operating Baidu USA since 2011 and has been conducting autonomous driving operations in Silicon Valley for more than five years.

In 2017 Baidu announced Apollo, an autonomous driving platform for which it garnered an array of development partners, ranging from Intel to Toyota.

It is running an autonomous taxi service in a few cities in China.

Geely SEA electric vehicle platform: EVs for everyone! (Image: Geely)

Geely Holding—parent company of brands including Volvo Cars, Lynk & Co and LEVC, and lead shareholder in Geely Auto, Proton and Lotus—and Baidu have announced the creation of a partnership for the development of highly automated electric vehicles.

Geely is going to be providing the platform—the Sustainable Experience Architecture, which it announced in September 2020 as an “open source” electric vehicle platform that it would offer to other global OEMs—and Baidu the digital horsepower.

Manufacturing vehicles is a different kind of hard than the challenges associated with developing AI systems.

It makes absolute sense that a digital company would partner with a hardware manufacturer—in this case, the hardware being a vehicle, not a smartphone.

In a market where there are some 21 million passenger vehicles sold per year, and where there is a comparatively low penetration rate of vehicle ownership (on the order of 173 vehicles per 1,000 people, compared with 837 in the U.S.), even a small slice of the market is still damned large.

And neither Geely nor Baidu seems to be focused on the small.–gsv