Conti Goes Renewable for Tire Concept

Yes, even reused water bottles make the mixture

By Gary S. Vasilash

Tires are made of lots of materials. Yes, there is natural rubber. And synthetic rubber.

There are carbon black and silica.

There are cables, both metal and textile.

And there are various chemicals added for good measure.

Continental has developed what it says is a considerably more sustainable tire, one in which more than 50% of the materials are traceable, renewable or recycled.

There are lots of organic materials, including the natural rubber from dandelions (not necessarily the ones you have in your lawn, but similar), silicate from rice husks, and vegetable oils from, well, vegetables rather than petroleum products from prehistoric plant matter and sea creatures.

Conti’s clever “green” tire. (Image: Continental)

Thirty-five percent of the Conti GreenConcept tire (yes, this is still conceptual; you can’t get one—yet) consists of renewable raw materials.

Then there are recycled materials, which account for about 17% of the tire. Things like the polyester recovered from PET bottles—bottles that are used for soda and water.

Another clever aspect of the tire is that it is lighter than a comparable conventional one. This lowers rolling resistance, which means less energy is needed to turn the tires. That can translate into as much as a 6% improvement in the range of an electric vehicle.
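To put that 6% claim in concrete terms, here is a bit of illustrative arithmetic. The percentage comes from the article; the 250-mile baseline is a made-up example, not a Continental figure.

```python
# Illustrative only: what a 6% range improvement means for an EV.
# The 6% figure is from the article; the baseline range is assumed.

def range_with_improvement(baseline_range_miles: float, improvement_pct: float) -> float:
    """Return the new range after a percentage improvement in efficiency."""
    return baseline_range_miles * (1 + improvement_pct / 100)

# An EV with a 250-mile baseline range would gain about 15 miles
# from the same battery pack.
print(round(range_with_improvement(250.0, 6.0)))  # 265
```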

(Continental knows more than a little about EVs, as its tires are on EVs from companies ranging from Audi to VinFast and, yes, Tesla.)

According to Continental CEO Nikolai Setzer, “Continental will completely convert its global tire production to the use of sustainable materials by 2050 at the latest.”

While that might seem like a long time, Continental has been making tires for some 150 years, so it is all relative.

What Does a LiDAR System “See”?

“LiDAR” stands for “light detection and ranging.” It is a key element of advanced driver assistance systems (ADAS) as well as autonomous driving systems.

By Gary S. Vasilash

In case you ever wondered about what a LiDAR system might “see,” here is a picture of a LiDAR point cloud that includes HD resolution and long-distance range.

AEye LiDAR image. (Image: Continental)

It is worth noting that this image, from LiDAR developer AEye, isn’t necessarily the output of all systems.

Each LiDAR system developer essentially has its own approach to how the laser beams are sent out and returned, over what distance, how often, and so on.
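The “ranging” part is common to all of them, though: time a laser pulse out and back, and the speed of light gives you the distance. A minimal sketch of that idea (the pulse timing below is an assumed example, not data from any particular system):

```python
# A minimal sketch of the "ranging" in LiDAR: distance from a laser pulse's
# round-trip time. Real systems, AEye's included, differ in scan pattern,
# wavelength and signal processing; this shows only the basic time-of-flight idea.

SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a target given the pulse's out-and-back travel time."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse that returns after roughly 6.67 microseconds hit something
# about 1,000 meters away, the long-range figure cited for the AEye system.
print(round(range_from_round_trip(6.67e-6)))  # 1000
```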

Supplier Continental has been working with AEye for the past 10 months and has now announced that it will be using the LiDAR system in its suite of hardware and software for autonomous driving systems (Level 2+ to Level 4).

In industry parlance this is known as a “stack.”

So the AEye LiDAR hardware and software will be part of the Continental stack, which will also include cameras and radar.

The rule in developing autonomous driving capability is that the more sensors the better.

Or, as Frank Petznick, head of Continental’s ADAS Business Unit, put it, “Reliable and safe automated and autonomous driving functions will not be feasible without bringing the strengths of all sensor technologies together.”

Let’s face it: If you’re in a vehicle that is driving itself, you want as many sensors and computers as possible to keep you safe when going from A to B.

Continental anticipates that it will start producing the AEye long-range LiDAR systems in production volumes by 2024.

One more interesting thing: while the AEye system is long range (1,000 meters), Continental has a solid-state, short-range, high-resolution 3D flash LiDAR that it will bring to market later this year.

When you hear a certain individual talking about how autonomous driving can be done with a single type of sensor and a smart processor, realize that companies like Continental aren’t creating stacks because they just like complexity.

They don’t.

They do like to assure safety.

Why Your iPhone Isn’t Like Your Car

The rumored Apple car notwithstanding, OEMs are making evident efforts to render their user interfaces large, icon-intensive and swipeable, just like a smartphone’s. Since Tesla rolled out the 17-inch screen in the Model S, there has been an ongoing effort to make screens big and familiar, with perhaps the most exaggerated example being the 56-inch Mercedes MBUX Hyperscreen.

Whereas it was once said that a “car is a computer on wheels,” it seems to have transitioned to a “car is a smartphone on wheels.”

After all, OEMs are not only aggressively integrating apps that they can mine for data and otherwise monetize, but the ability to have over-the-air updates (OTAs) is becoming as de rigueur as standard Android Auto and Apple CarPlay.

Tamara Snow of Continental. The complexity of vehicle compute, control and communications networks shouldn’t be underestimated. (Image: Continental)

But Tamara Snow, head of Research and Advanced Engineering, North America, Continental Automotive, points out that when it comes to compute architecture and application, there are some substantial differences between that high-powered processing and communications device you carry around and the high-powered processing and transportation device that carries you around.

Snow notes that a smartphone has:

  • 1 microprocessor
  • 1 display
  • 1 operating system
  • 7 sensors
  • 6.1-ounce mass

And the smartphone has a top speed of 0 mph.

A vehicle has:

  • 100 microcontrollers
  • 4 displays
  • 4 operating systems
  • 100s of sensors
  • 2.5-ton mass

The vehicle has a top speed of 155 mph.

Snow says that when it comes to a smartphone, a software glitch can be “annoying.”

But the same for a motor vehicle can be “fatal.”

Making cars, and the systems that go into them, is hard. –gsv