Advancements in embedded sensors are enabling the automobile industry to enhance ADAS designs for self-driving vehicles.
FREMONT, CA: Sensors are gaining traction in the automobile industry, and the push toward self-driving cars is encouraging manufacturers to develop more capable embedded sensors. The computing power and advanced machine intelligence in autonomous vehicles assist the centralized processing unit in analyzing multiple, sometimes conflicting data streams to produce a single, accurate view of the environment. These capabilities largely boost the prospects of developing advanced driver-assistance systems (ADAS) that combine processors, sensors, and central fusion units to interpret the massive amount of data collected by the various sensors.
There are two popular configurations for the embedded sensors used in autonomous vehicles: the raw-data sensor fusion platform and the hybrid sensor fusion platform. In the raw-data configuration, a powerful computer at the center processes all the data coming from the embedded sensors. Although conceptually simple, this configuration requires high-speed data links to transfer data from the various sensors to the central sensor fusion CPU or GPU in real time. If implemented successfully, however, it can significantly boost the prospects of achieving Level 3 and higher (L3+) automated driving.
Alternatively, the hybrid sensor fusion platform carries out part of the computation at the sensors themselves. This pre-processing allows the central sensor fusion platform to be more lightweight. Another advantage of this configuration is that pre-processed data requires a significantly lower data rate at the sensor interface.
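To make the contrast concrete, here is a minimal Python sketch of the two configurations. It is an illustration only: the class names, payload fields, and data structures are hypothetical assumptions, not taken from any specific ADAS platform. In the raw-data approach every sensor streams unprocessed frames to a central fusion node; in the hybrid approach each sensor reduces its frame to a compact object list before transmission.

```python
# Illustrative sketch only; names and structures are hypothetical assumptions,
# not drawn from any specific ADAS platform described in the article.
from dataclasses import dataclass
from typing import List

@dataclass
class RawFrame:
    sensor_id: str
    payload_bytes: int      # e.g. a full camera image or radar point cloud

@dataclass
class ObjectList:
    sensor_id: str
    detections: List[dict]  # pre-processed detections (position, velocity, class)

class RawDataFusion:
    """Raw-data configuration: every sensor ships full frames to the
    central CPU/GPU, which performs all detection and tracking itself.
    Requires high-bandwidth links from each sensor to this node."""
    def fuse(self, frames: List[RawFrame]) -> List[dict]:
        # Heavy central workload: everything is processed here.
        return [{"source": f.sensor_id, "raw_bytes": f.payload_bytes} for f in frames]

class HybridFusion:
    """Hybrid configuration: sensors pre-process locally and send only
    compact object lists, keeping the central platform lightweight and
    lowering the data rate at the sensor interface."""
    def fuse(self, object_lists: List[ObjectList]) -> List[dict]:
        # Lighter central workload: merge already-detected objects.
        merged: List[dict] = []
        for ol in object_lists:
            merged.extend(ol.detections)
        return merged
```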
The key embedded sensors for autonomous vehicles include embedded cameras, radar, and lidar systems. Higher levels of automated driving require high-definition cameras with higher frame rates for object recognition, but poor visibility can degrade a camera's performance. Radar is a useful addition in such scenarios because it operates even in difficult weather and light conditions; however, radar typically offers an angular resolution of about 1.2° and a range accuracy of about 10 cm. An embedded lidar system offers the best of both sensors, capturing a 3D map of the region surrounding the car with a resolution superior to that of a camera. Further, a lidar system has an angular resolution of 0.1° and a range accuracy of 5 cm or less.
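To illustrate why these complementary accuracies matter, the short sketch below fuses a radar and a lidar range measurement by inverse-variance weighting, a standard estimation technique rather than one prescribed by the article. The sigma values are simply the approximate accuracies quoted above (radar about 10 cm, lidar 5 cm), and the measurements are made-up example numbers.

```python
# Minimal inverse-variance fusion of two range measurements.
# Assumes independent, zero-mean Gaussian errors; sigma values follow the
# approximate accuracies quoted above (radar ~0.10 m, lidar ~0.05 m).

def fuse_ranges(r_radar: float, sigma_radar: float,
                r_lidar: float, sigma_lidar: float) -> tuple:
    """Return the fused range estimate and its standard deviation."""
    w_radar = 1.0 / sigma_radar ** 2
    w_lidar = 1.0 / sigma_lidar ** 2
    fused = (w_radar * r_radar + w_lidar * r_lidar) / (w_radar + w_lidar)
    sigma_fused = (w_radar + w_lidar) ** -0.5
    return fused, sigma_fused

# Hypothetical example: radar reads 42.10 m (+/-0.10 m), lidar 42.03 m (+/-0.05 m).
estimate, sigma = fuse_ranges(42.10, 0.10, 42.03, 0.05)
print(f"fused range: {estimate:.3f} m, sigma: {sigma:.3f} m")
# The fused uncertainty (~0.045 m) is smaller than either sensor's alone,
# which is the basic payoff of combining sensors in a fusion unit.
```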
The embedded sensors described above underpin the evolution of autonomous vehicles. Further advancements in embedded sensors may open the gateway to ADAS with even higher levels of automation.