As ADAS technology extends to critical, time-sensitive applications – such as emergency braking, front-collision warning and avoidance, and blind-spot detection – combining data from multiple sensors enables reliable, real-time decisions for safer autonomous driving
From reading road signs to keeping you inside lane markers, artificial-intelligence-assisted cameras are already making our vehicles smarter and safer. But what happens when the fog rolls in and your camera’s vision is as compromised as yours?
“A camera might be great for object recognition, but it’s not so good in bad weather or at night,” said Miro Adzan, general manager of advanced driver assistance systems (ADAS) at our company. “However, radar will continue to work in rain, snow or mist. Driver assistance systems need to incorporate a range of different sensors so the vehicle can take full advantage of the benefits of these different technologies.”
Using the strengths of different sensor types is not just a matter of switching between them for different conditions or applications. Even in clear weather, a camera captures object detail better, while radar measures an object's distance more accurately.
As these systems extend to critical, time-sensitive applications – such as emergency braking, automatic parking, front-collision warning and avoidance, and blind-spot detection – design engineers will need to fuse these different information sources into a single picture to deliver reliable, real-time decisions.
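The idea of fusing complementary sensors can be sketched numerically. Assuming each sensor's range estimate carries independent Gaussian noise, a standard approach is inverse-variance weighting: the more precise sensor (radar, for range) dominates the fused result. The sensor noise figures below are illustrative assumptions, not values from any TI datasheet.

```python
def fuse_estimates(measurements):
    """Inverse-variance weighted fusion of independent range estimates.

    measurements: list of (value_m, variance_m2) tuples, one per sensor.
    Returns (fused_value, fused_variance).
    """
    inv_vars = [1.0 / var for _, var in measurements]
    total = sum(inv_vars)
    fused = sum(v * w for (v, _), w in zip(measurements, inv_vars)) / total
    return fused, 1.0 / total

# Camera: coarse range (sigma = 2 m); radar: precise range (sigma = 0.2 m).
camera = (24.0, 4.0)
radar = (25.0, 0.04)
fused, var = fuse_estimates([camera, radar])
# The fused estimate sits close to the radar reading, and its variance is
# smaller than either sensor's alone -- the benefit of combining them.
```

The same weighting idea generalizes to full state estimation (e.g., a Kalman filter), where each sensor updates a shared model of the vehicle's surroundings.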
“For automatic parking, you need to combine data from cameras, radar and sometimes ultrasound to give the vehicle an accurate sense of what’s going on around you,” said Curt Moore, general manager for Jacinto™ processors at our company. “None of these sensors would be accurate enough on their own, but by combining them, you can get a much more accurate picture of the space around you. This allows you to park in much tighter spaces without the risk of causing damage.”
Advanced safety systems are no longer reserved only for high-end automobiles. Nearly 93% of vehicles produced in the U.S. come with at least one ADAS feature – and automatic emergency braking is set to become standard across 99% of new cars in the United States by September.
The shift is a result of the decreasing cost and size of sensors, such as TI mmWave radar sensors, which integrate an entire radar system into a chip the size of a coin.
“Ten years ago, radar was predominantly used in military applications because of size, cost and complexity,” Miro said. “But today, radar is on the verge of becoming a standard component in the car.”
While the proliferation of affordable sensors opens up new applications, it also creates new challenges for ADAS engineers who need to design systems that bring together all the data streams and process them efficiently, while meeting tight affordability and power constraints.
In a single-sensor ADAS system, pre-processing data for object detection takes place close to the sensor in order to use that information immediately. But sensor fusion requires that raw, high-resolution data be instantly transmitted to a central unit for processing to form a single, accurate model of the environment that will help the vehicle avoid a collision.
“With all the data coming in from these sensor nodes, the challenge is making sure all of it is synchronized so the vehicle can understand what’s happening around you and make critical decisions,” said Heather Babcock, general manager for FPD-Link™ products at our company. “In order to transmit synchronized data in real time, it’s important to have high-bandwidth, uncompressed transmission capability because compressing data introduces latencies.”
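The synchronization requirement Heather describes can be illustrated with a small sketch. Assuming each sensor node delivers timestamped frames, a central fusion step can pick each stream's nearest frame to a common instant and reject the set if any sample is staler than a tolerance. The sensor names, frame rates, and the 5 ms tolerance are illustrative assumptions, not part of any FPD-Link specification.

```python
from bisect import bisect_left

def nearest_frame(timestamps, t):
    """Index of the timestamp closest to t in a sorted list."""
    i = bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    return min(candidates, key=lambda j: abs(timestamps[j] - t))

def synchronized_set(streams, t, tol_s=0.005):
    """Return {sensor: frame_index} if every stream has a frame within
    tol_s seconds of time t, else None (fusion must skip this instant)."""
    picks = {}
    for name, timestamps in streams.items():
        j = nearest_frame(timestamps, t)
        if abs(timestamps[j] - t) > tol_s:
            return None
        picks[name] = j
    return picks

streams = {
    "camera": [0.000, 0.033, 0.066, 0.100],                 # ~30 fps
    "radar":  [0.000, 0.020, 0.040, 0.060, 0.080, 0.100],   # 50 Hz
}
```

At t = 0.100 s both streams have a fresh frame, so fusion can proceed; at t = 0.033 s the nearest radar sample is 7 ms stale, so this instant is skipped. In a real system the tolerance trades off latency against the risk of fusing mismatched views of the scene.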
Our FPD-Link communications protocol, which was initially created for transmitting digital video streams from graphics processors to digital displays, is designed for transmitting large amounts of uncompressed data over several meters with simple, easily routable cables.
“You have a standard protocol on one end that is converted into a serial FPD-Link stream, which is an extremely secure and robust proprietary encoding,” Heather said. “That’s matched to a paired deserializer on the other end that reconstructs the data into its original format and transmits it over various interface protocols that TI’s product portfolio supports.”
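The paired serializer/deserializer pattern can be illustrated with a toy framing scheme: a sync word, a payload length, the raw bytes, and a checksum the receiver verifies before reconstructing the data. This is not the proprietary FPD-Link encoding, only a minimal sketch of the general pattern of matching a serializer to a deserializer on the far end of the link.

```python
import struct
import zlib

SYNC = b"\xa5\x5a"  # illustrative two-byte sync word

def serialize(payload: bytes) -> bytes:
    """Frame a payload for transmission over a serial link."""
    return (SYNC
            + struct.pack(">I", len(payload))     # big-endian length
            + payload
            + struct.pack(">I", zlib.crc32(payload)))  # integrity check

def deserialize(frame: bytes) -> bytes:
    """Reconstruct the original payload, verifying sync and checksum."""
    if frame[:2] != SYNC:
        raise ValueError("lost sync")
    (length,) = struct.unpack(">I", frame[2:6])
    payload = frame[6:6 + length]
    (crc,) = struct.unpack(">I", frame[6 + length:10 + length])
    if zlib.crc32(payload) != crc:
        raise ValueError("corrupted frame")
    return payload
```

A round trip (`deserialize(serialize(data)) == data`) recovers the original bytes exactly, which is the property the article highlights: uncompressed data arrives at the central processor in its original form, with no decompression latency.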
Once this data is at the central processor, integrating it into a unified model of the car’s surroundings typically requires computationally intensive signal processing and deep-learning algorithms – with a consequent increase in required power input and heat output.
The physical constraints of an automobile place tight limits on the size and weight of batteries and cooling infrastructure, so ADAS engineers need processors specifically designed to perform these tasks as efficiently as possible.
Our Jacinto processors combine dedicated digital signal processing (DSP) and matrix multiplication cores that operate with the lowest available power in the industry, even at temperatures of up to 125 degrees Celsius.
“There are tremendous advantages in integrating the DSP and the processor into one system on a chip,” Curt said. “Otherwise, each will need its own memory and power supply, driving up the system cost. The other advantage is the reduction in latency gained by integrating these operations into one chip.”
In addition to power-efficient processors, our automotive-qualified power management integrated circuits with functional safety features for sensor fusion, front cameras and domain controllers improve overall power efficiency and functionality within the vehicle.
Beyond the individual components, our entire ecosystem of ADAS products is created for seamless compatibility, allowing car manufacturers to select from a holistic portfolio that can be scaled to the demands and price points of their vehicles.
“We have all the pieces of the ADAS puzzle designed in a way that keeps the various challenges of the vehicle in mind,” Miro said. “That makes the system design easier for our customers.”
Information from https://news.ti.com/