How sensor fusion impacts the automotive ecosystem

Market news |
By Rich Pell

The automotive ecosystem will soon be driven by high-performance solutions for automated driving, finds IHS senior analyst Akilesh Kona. To automate driving functions, a vehicle needs reliable information about its surroundings; that need, in turn, translates into the need for multiple, redundant sensor types.

Typically, Advanced Driver Assistance Systems (ADAS) rely on a dynamic 360-degree live image of the vehicle's surroundings. This image is computed from the data of different sensors – cameras, radar, lidar and, to a lesser extent, ultrasonic sensors. Different sensor types are necessary because each has its specific limitations: cameras, for instance, deliver poor images in low light and unfavorable weather.

Radar sensors are affected much less by weather conditions, but they deliver relatively poor images with low resolution. Lidar sensors offer much better-defined images, yet they too perform poorly in rain, snow or hail. Blending all these signals together yields reliable, high-definition images, redundant enough to be used in safety-critical applications such as automated driving.
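The blending described above can be illustrated with a minimal inverse-variance-weighted fusion sketch: each sensor contributes an estimate weighted by its current reliability, so a weather-degraded camera is automatically down-weighted. The sensor readings and variance figures below are hypothetical, chosen only to illustrate the principle.

```python
# Minimal sketch of inverse-variance-weighted sensor fusion.
# Each sensor reports a distance estimate (metres) plus a variance
# reflecting its current reliability; all numbers are illustrative.

def fuse(measurements):
    """Fuse (value, variance) pairs into one estimate and one variance."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, 1.0 / total

# In heavy rain the camera's variance rises, so its weight drops and
# the weather-robust radar reading dominates the fused estimate.
readings = [
    (52.0, 25.0),  # camera: degraded by rain, high variance
    (50.2, 1.0),   # radar: barely affected by weather, low variance
    (50.8, 9.0),   # lidar: partially degraded by rain
]
distance, variance = fuse(readings)
```

Note that the fused variance is smaller than that of the best single sensor, which is the redundancy argument in a nutshell: combining imperfect sensors produces an estimate more trustworthy than any one of them alone.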

This situation gives startup companies an opening to contribute their specific expertise. And since the established players wish to remain competitive, they have started to acquire the desired expertise through takeovers, notes IHS. Examples include General Motors acquiring self-driving technology company Cruise Automation, Delphi taking over Carnegie Mellon spin-off Ottomatika, and Dura Automotive Systems collaborating with Green Hills Software to develop sensor fusion modules for automated driving.

The semiconductor industry, too, is striving to provide the high-performance computing solutions necessary for the demanding task of sensor fusion. Examples are NXP's BlueBox and Mobileye's EyeQx platforms, in particular the latter's latest iteration, the EyeQ5. To some extent, chip vendors can draw on the expertise they have built up in similar designs for consumer markets.

Deep learning techniques and machine vision are regarded as a good way to solve the decision-making problems of automated driving. However, these technologies are new to the automotive industry. Expertise in these fields resides in research institutions and universities, which stimulates the launch of start-ups and spin-offs. These companies – if not acquired by established players – are creating a new dimension in the supply chain, as IHS analyst Akilesh Kona puts it.

20% CAGR for the next decade:
The market for sensor fusion modules is taking off

The bottom line: The demand for sensor fusion algorithms and platforms creates a new and growing market within the automotive value chain. In 2015, just four percent of new vehicle platforms included sensor-fusion electronic control units (ECUs) for surround-view park assistance and safety-critical functions; by 2025, 21 percent will include them, the market researcher predicts.

The 20% compound annual growth rate (CAGR) for sensor-fusion ECUs between 2015 and 2025 is one of the highest growth rates for components used in the automotive industry – providing an opportunity for new suppliers to enter the automotive market, at various levels of the supply chain.
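To put that 20% figure in perspective, a quick back-of-the-envelope calculation shows what such a compound annual growth rate implies over the 2015–2025 window. The absolute market sizes here are not from the IHS report; the snippet only illustrates the arithmetic of CAGR.

```python
# What a 20% CAGR means over a decade: the market compounds every year,
# so the total growth factor is (1 + rate) raised to the number of years.
cagr = 0.20
years = 2025 - 2015

growth_factor = (1 + cagr) ** years  # roughly a 6.2x expansion

# The inverse direction: recover the CAGR from start and end values
# (hypothetical market sizes, for illustration only).
start_value, end_value = 100.0, 100.0 * growth_factor
implied_cagr = (end_value / start_value) ** (1 / years) - 1
```

In other words, a market growing at 20% per year roughly sextuples over ten years, which explains why the rate is described as one of the highest among automotive components.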

Related articles:
Audi rainmaker sketches new computing architecture for cars
NXP attacks Nvidia with number-crunching ADAS platform
Mobileye/ST Gun for Sensor Fusion, Go after NXP

