The Trials of Remodeling

If you had an NVIDIA-powered system you could pull off that basement remodel without a hitch. . .

By Gary S. Vasilash

If you are, say, redoing your basement, you might think you’ve got everything planned out to the final light fixture, only to discover along the way that something isn’t going to allow the project to proceed as anticipated, such as a support pole being in the “wrong” place. (It, of course, is in the right place. Your plans are off.)

You might think that this is something that couldn’t happen during professional projects.

Like when modifying an existing factory to accommodate a new vehicle or to add capacity.

Turns out, factories can be just like basements.

While half of those robots are where they need to be, the question is whether the other half will be able to do what needs to be done. So simulation lets BMW engineers know. (Image: BMW)

Only the consequences can be greater when it turns out the support beam is in the way.

BMW plans to launch more than 40 new or updated vehicles between now and 2027.

It has more than 30 production sites to prepare.

To do this with as few hitches as possible, it is using its “Virtual Factory.”

That’s a simulation system that’s based on the NVIDIA Omniverse.

Inputs to the simulation include everything from building data to vehicle metrics, equipment information to manual work operations.

Simulations are run in real time.

Potential collisions (e.g., banging into a column) are automatically determined.

What’s surprising is that before this digital twin approach it was sometimes necessary to manually move a vehicle through the plant to make sure everything fit.

And in some cases it was necessary to drain the dip tanks in the paint shop, which is not only time-consuming, but expensive.

And speaking of costs: BMW says the Virtual Factory approach will save as much as 30% in production planning.

Developing Autonomy Digitally

MITRE, Mcity & NVIDIA. . .

By Gary S. Vasilash

When the subject is NVIDIA and the auto industry, it is often related to how companies like Volvo are leveraging the company’s silicon to develop autonomous capabilities.

But what is possibly more germane and interesting on the road to autonomous driving is an announcement by MITRE Corporation, a government-sponsored nonprofit research firm, that it is partnering with Mcity, a 32-acre site in southeastern Michigan operated by the University of Michigan where autonomous driving systems can be safely developed.

Mcity includes various road surfaces, road signs, building facades, various types of crossings, underpasses, guard rails, and other real-world elements of driving.

NVIDIA’s Orin. Not only does NVIDIA develop tech for autonomy in vehicles, it also has tech that enables the simulations needed to develop autonomous capabilities. (Image: NVIDIA)

What MITRE and Mcity will be doing is using NVIDIA Omniverse Cloud Sensor RTX APIs.

Or said more simply: developing the means by which there will be a comprehensive digital representation of the entire Mcity environment, including the cameras, lidars, radars, and ultrasonic sensors on vehicles.

The objective of having things like a digital twin of Mcity is to provide the means by which simulations can be run to test the performance of virtual vehicles. Then this information can be used in the development of physical vehicles.

By running the tests first in the virtual environment, there can be repeated, consistent tests, as well as the ability to adjust parameters to present different conditions and/or capabilities, so that when the real car is on the roadways of Mcity it is more likely to perform as expected.
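
To make “repeated consistent tests” concrete, here is a minimal sketch in plain Python rather than the actual Omniverse tooling: a fixed random seed makes every run of a scenario identical, while a parameter sweep varies the conditions. The scenario fields, scoring, and numbers below are entirely hypothetical.

```python
# Hypothetical sketch of repeatable, parameter-swept virtual testing.
# Illustrative Python only, not the NVIDIA Omniverse Cloud Sensor RTX APIs.
import itertools
import random
from dataclasses import dataclass

@dataclass
class Scenario:
    weather: str          # e.g., "clear", "rain", "fog"
    pedestrian_count: int
    seed: int             # fixing the seed makes a run exactly repeatable

def run_virtual_test(scenario: Scenario) -> float:
    """Stand-in for a full sensor/vehicle simulation; returns a mock safety score."""
    rng = random.Random(scenario.seed)        # same seed -> same simulated conditions
    noise = rng.random() * 0.1                # stands in for sensor noise, traffic, etc.
    weather_penalty = {"clear": 0.0, "rain": 0.1, "fog": 0.2}[scenario.weather]
    return max(0.0, 1.0 - weather_penalty - 0.01 * scenario.pedestrian_count - noise)

# Sweep the conditions the way a digital twin allows: every combination, every time,
# with identical results on every rerun.
for weather, peds in itertools.product(["clear", "rain", "fog"], [0, 5, 10]):
    score = run_virtual_test(Scenario(weather, peds, seed=42))
    print(f"{weather:5s}  pedestrians={peds:2d}  score={score:.2f}")
```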

Sure, in order to get autonomous capabilities vehicles are going to need to use things like the NVIDIA Orin processor. But to get to those vehicles, there needs to be a whole lot of development, which things like the NVIDIA Omniverse Cloud Sensor RTX APIs can be instrumental in.

A Consideration Regarding NVIDIA GTC

By Gary S. Vasilash

Quick quiz.

What’s missing from these lists:

  • BYD
  • Xpeng
  • Hyper (a luxury brand owned by GAC AION)

Those OEMs will all be using the in-vehicle computing platform that’s architected for generative AI applications, NVIDIA DRIVE Thor.

And there are:

  • NIO
  • Geely

NIO is using NVIDIA AI stacks for its in-cabin capabilities including Cabin Atmosphere Master and Vehicle Assistant.

Geely is using NVIDIA TensorRT-LLM, generative AI and large language model (LLM) tech, for personalized cabin experiences.

These were announced at the NVIDIA GTC conference this week in Silicon Valley.

What’s missing? Companies like Ford and GM.

It should be noted that other companies, including BMW and Mercedes, are using NVIDIA tech for their vehicles.

And that Danny Shapiro, NVIDIA vice president, Automotive, points out that non-listed OEMs use NVIDIA GPUs in their data centers.

NVIDIA DRIVE Thor (Image: NVIDIA)

DRIVE Thor, which will make its way into vehicles by next year, provides 1,000 teraflops of performance (with a teraflop being one trillion floating-point operations per second, and while that may not be meaningful in and of itself, clearly that’s a whole lot of processing capability).
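
For rough perspective on that figure, some back-of-the-envelope arithmetic (the 30 fps frame rate below is an arbitrary illustration, not anything NVIDIA specifies):

```python
# Back-of-the-envelope arithmetic on the 1,000-teraflop figure cited above.
TERAFLOP = 1e12                          # one trillion floating-point operations per second
thor_flops_per_sec = 1_000 * TERAFLOP    # 1,000 teraflops -> 1e15 operations per second

per_millisecond = thor_flops_per_sec * 1e-3
per_camera_frame = thor_flops_per_sec / 30   # assuming a 30 fps feed, purely illustrative

print(f"{per_millisecond:.1e} operations every millisecond")   # 1.0e+12
print(f"{per_camera_frame:.1e} operations per 30 fps frame")   # 3.3e+13
```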

To be fair, NVIDIA isn’t the only game in town, with competitors including Intel, AMD, Qualcomm, and others. They may be providing the silicon to the OEMs not on the lists.

But you’d think that as NVIDIA holds its massive GTC global technology event in San Jose this week there would be at least some announcement of automotive companies that aren’t based in China using the company’s processors.

While there is concern in the U.S. about low-cost Chinese EVs threatening the U.S. market at some point, there ought to be similar concern about highly capable Chinese EVs offering all manner of AI-enhanced features and functions doing the same.

NVIDIA, incidentally, is headquartered in Santa Clara, California, so it is not like U.S.-based OEMs would have to travel far to pay it a visit.

Talking Tech With NVIDIA

By Gary S. Vasilash

NVIDIA is a company that was once familiar primarily to gamers because of the GPU chips that it had developed that made rendering both fast and highly detailed.

Now NVIDIA is just as familiar to those in the auto world, as the company is working with Jaguar Land Rover, Mercedes, Volvo and more.

Lucid Motors is using NVIDIA tech in its Air. BYD has announced it is working with the company, as well.

NVIDIA developing maps for autonomous driving operations. (Image: NVIDIA)

What’s interesting is that these companies are using NVIDIA tech to build systems that provide the characteristics that they are looking for to make their vehicles distinctive.

NVIDIA is not merely producing processors that have massive processing capability (the Jetson Orin operates at up to 275 TOPS, or trillion operations per second); it is also developing software that will help facilitate autonomous driving operations.

The company has developed a mapping system that not only features information collected by specific vehicles, but also takes in crowdsourced information so that there is an accurate representation of what is going on: say a construction zone has popped up since the original map data was collected. The system has it.
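
Purely as an illustration of the layering idea, and not NVIDIA’s actual map stack, here is a hypothetical sketch of a surveyed base map with a fresher crowdsourced observation folded in; every name and value is made up:

```python
# Hypothetical illustration: a surveyed base map with fresher crowdsourced
# observations layered on top. Not NVIDIA's actual mapping system.
base_map = {
    "I-94 mile 210": {"lanes": 3, "speed_limit_mph": 70},
}

crowdsourced_updates = [
    {"segment": "I-94 mile 210", "observation": "construction zone", "lanes_open": 1},
]

def effective_map(base, updates):
    """Return the base map with the latest crowdsourced observations merged in."""
    merged = {segment: dict(attrs) for segment, attrs in base.items()}
    for update in updates:
        merged.setdefault(update["segment"], {}).update(
            observation=update["observation"], lanes_open=update["lanes_open"]
        )
    return merged

print(effective_map(base_map, crowdsourced_updates))
# {'I-94 mile 210': {'lanes': 3, 'speed_limit_mph': 70,
#                    'observation': 'construction zone', 'lanes_open': 1}}
```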

On this edition of “Autoline After Hours” NVIDIA vice president of Automotive Danny Shapiro discusses what the company is doing and how it is doing it.

Arguably NVIDIA is at the forefront of developing the technology that will change transportation in many ways.

He talks with “Autoline’s” John McElroy, Joe White of Reuters and me.

And during the second half of the show McElroy, White and I discuss a variety of topics, including the opening of the Tesla plant in Berlin, the speculation that Porsche might build the long-rumored Apple car, the announced range of the Ford F-150 Lightning, and more.

And you can see it all here.

Volvo: The Return to Safety

Back to a core value

By Gary S. Vasilash

Volvo cars were once widely known for two characteristics:

  1. Their boxy design
  2. The fact that they were built with safety foremost

The company essentially “owned” safety in the minds of consumers.

But in the mid- to late-90s the company wanted to be more than something that was the Official Car Builder for Tweed-Jacket-With-Suede-Elbow-Patch-Wearing and Pipe-Smoking East Coast Professors.

Style took over from safety.

The design team members were evidently given French curves to supplement the T-squares.

And while the engineers back in Gothenburg were still figuring out the materials and the structures and the systems that would make the Swedish vehicles safe, their laudable efforts were eclipsed by things like Val Kilmer’s character driving a C70 in The Saint.

But safety is back.

In 2022 Volvo will launch a fully electric SUV, the flagship model for the brand.

(Image: Volvo)

It will come standard with a LiDAR system from Luminar and an on-board supercomputer from NVIDIA.

“Volvo Cars is and always has been a leader in safety. It will now define the next level of car safety,” said Håkan Samuelsson, Volvo chief executive.

When it comes to autonomous driving, the thing is that there is little in the way of driving and a whole lot in the way of trusting.

As in trusting that the system is going to work because you, even though behind the wheel, are acting as a passenger.

Safety is huge when it comes to autonomy. Which means a need for plenty of sensors, including LiDAR, and the wherewithal to process that information so that the system will have the appropriate responses (e.g., braking, turning, accelerating).

By coming out and saying that this tech is going to be built into its new vehicle, it seems as though Volvo is ready to take that safety mantle back.

(Kilmer? He’ll be back this fall as Iceman in Top Gun: Maverick.)

BMW Virtual Art Car

No word on whether an NFT is involved

By Gary S. Vasilash

BMW has long been a leader in supporting artists through providing them with a highly visible canvas: a BMW vehicle. So there have been “BMW Art Cars” painted by John Baldessari, Alexander Calder, Jeff Koons, Andy Warhol, etc.

BMW has been doing this for 50 years.

The OEM has contracted with Nathan Shipley, director of creative technology at Goodby, Silverstein & Partners, and Gary Yeh, founder of artDrunk, to create “The Ultimate AI Masterpiece.”

The BMW “Ultimate AI Masterpiece,” based on an 8 Series. (Image: BMW)

On a simple level, they used a system based on an NVIDIA StyleGAN AI model to scan more than 50,000 artworks spanning a 900-year period. They added in not only works by the artists who had done Art Cars, but also works from emerging artists.

And the result was projection mapped onto a BMW 8 Series Gran Coupe. Or at least a virtual rendition of one.

Said Shipley: “AI is an emerging medium of creative expression. It’s a fascinating space where art meets algorithm. Combining the historical works with the curated modern works and projecting the evolving images onto the 8 Series Gran Coupe serves as a direct nod to BMW’s history of uniting automobiles, art, and technology.”

That said, actual artists doing work on actual cars somehow seems like more of an execution of creative expression than running an algorithm.

Automating Big Rigs

There’s a lot of weight being hauled by one of those things. And a whole lot of processing for autonomy

Here’s something to think about the next time you’re rolling down the highway in your compact crossover:

One of those big rigs that is on the road with you can weigh 80,000 pounds with a full trailer.

It doesn’t take a physicist to calculate that, consequently, stopping and maneuvering are going to require more time than they do for the vehicle you’re in.

Plus system (look at the top of the cab) uses lidar, radar and cameras. (Image: Plus)

As drivers of those trucks tend to be on long-distance routes, developing autonomous driving capability for them is a growing area of interest.

One such company is Plus, which is developing self-driving truck tech. According to Hao Zheng, CTO and co-founder of the company, it has more than 10,000 pre-orders for its system.

Here is a number from him that is even more astonishing than the aforementioned 80,000 and 10,000—even more than 80,000 times 10,000: “Enormous computing power is needed to process the trillions of operations that our autonomous driving system runs every fraction of a second.”

Trillions of operations every fraction of a second?

Plus has opted to develop its system using the NVIDIA Orin, which, according to NVIDIA, can deliver 254 trillion operations per second.
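
Zheng’s claim is easy to sanity-check against that spec. A quick back-of-the-envelope, where the particular “fraction of a second” is an illustrative choice of mine, not his:

```python
# Sanity check: does 254 TOPS square with "trillions of operations every fraction of a second"?
orin_tops = 254               # trillion operations per second, NVIDIA's figure for Orin
fraction_of_a_second = 0.01   # 10 milliseconds, an illustrative choice

ops_in_trillions = orin_tops * fraction_of_a_second
print(f"~{ops_in_trillions:.1f} trillion operations in {fraction_of_a_second} s")
# ~2.5 trillion operations in 0.01 s, i.e. trillions even in a hundredth of a second
```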

Evidently enough.

Still, driver or no, you’ve got to show those vehicles some respect.–gsv