The future of autonomous driving, and the role that cameras, radars, and LiDARs will play in it, has been a hot topic of discussion in the automotive sector. The cost and viability of each sensor type are the main points of dispute, and they have led some automakers to abandon radar and LiDAR in favor of camera-based autonomous technology.
Although each type of sensor has its own advantages and disadvantages, all three technologies are essential to the development of autonomous driving:
- Cameras are excellent at classifying objects and reading street signs, but they struggle in bad weather.
- Radars are excellent in bad weather and provide precise speed and distance data, but they are unable to read signs and have trouble with traffic lights.
- LiDARs measure objects with great accuracy, but they are expensive.
Although radar has been used in the automotive industry since the 1980s, the technology has been the subject of intense debate for years, because older vehicle architectures cap the amount of data that radars can deliver to the central processor.
Does this spell the end for radar in automotive technology? Possibly not.
Radars produce massive volumes of data, but current in-vehicle networks lack the bandwidth to carry that raw data from the radars to the ECUs.
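As a rough back-of-the-envelope illustration, consider the raw ADC output of a hypothetical imaging radar compared with the nominal rates of common legacy in-vehicle links. Every figure in this sketch is an assumption chosen for illustration, not the specification of any particular sensor:

```python
# Back-of-the-envelope estimate of raw radar output vs. legacy in-vehicle links.
# All figures are illustrative assumptions, not the specs of any particular sensor.

ADC_SAMPLE_RATE_HZ = 20e6   # assumed ADC sample rate per receive channel
BITS_PER_SAMPLE = 16        # assumed ADC resolution
RX_CHANNELS = 16            # assumed number of receive channels
DUTY_CYCLE = 0.5            # assumed fraction of time the radar is actively chirping

raw_rate_bps = ADC_SAMPLE_RATE_HZ * BITS_PER_SAMPLE * RX_CHANNELS * DUTY_CYCLE
print(f"Raw radar data rate: {raw_rate_bps / 1e9:.2f} Gbit/s")  # ~2.56 Gbit/s

# Nominal rates of common legacy in-vehicle links, for comparison.
legacy_links_bps = {
    "CAN FD": 5e6,
    "100BASE-T1 Ethernet": 100e6,
    "1000BASE-T1 Ethernet": 1e9,
}
for name, rate in legacy_links_bps.items():
    print(f"{name}: raw stream is {raw_rate_bps / rate:.0f}x the link capacity")
```

Even with conservative assumptions, a single radar's raw stream lands in the multi-gigabit range, well beyond what these legacy links can carry.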
Because of this limitation, radar manufacturers have had to build processing capabilities into the sensor unit itself.
This is sub-optimal from a system standpoint since no single ECU can receive all the raw data and make a reliable judgment based on it.
To make headway in autonomous systems, radar makers need to move from this edge architecture, in which each radar processes its own data locally, to a satellite architecture that shifts processing to a more central location in the vehicle, allowing a single ECU to obtain raw data from several sensors.
This is in line with the broader automotive trend toward sensor fusion, which lets a vehicle's central processor weigh inputs from several radars, cameras, and LiDARs, each with its own strengths and shortcomings. In the era of software-defined vehicles, sensor fusion can drastically reduce complexity and cost while helping vehicles make safer decisions.
Processing each radar's data locally also makes it harder to apply AI and machine-learning algorithms; a consolidated processor in the vehicle can run such algorithms over the raw data from every sensor and extract considerably more information from it. A large ecosystem of software companies can develop these algorithms, offering even greater accuracy and performance.
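To make the idea concrete, here is a minimal sketch of one simple centralized-fusion technique, inverse-variance weighting, applied to distance estimates of the same object reported by different sensors. The sensor readings and noise figures are illustrative assumptions, not values from any production stack:

```python
# Minimal sketch: inverse-variance weighted fusion of distance estimates.
# Sensors with lower measurement variance (more certainty) get more weight.
# All readings and variances below are illustrative assumptions.

def fuse_estimates(measurements):
    """Fuse (value, variance) pairs into a single estimate and variance."""
    weights = [1.0 / var for _, var in measurements]
    total_weight = sum(weights)
    fused_value = sum(w * val for w, (val, _) in zip(weights, measurements)) / total_weight
    fused_variance = 1.0 / total_weight
    return fused_value, fused_variance

# Hypothetical distance-to-object readings (metres) with per-sensor variance.
readings = [
    (42.3, 0.25),  # radar: precise range measurement
    (44.0, 4.00),  # camera: range inferred from pixels, much noisier
    (42.1, 0.10),  # LiDAR: very precise range measurement
]

distance, variance = fuse_estimates(readings)
print(f"Fused distance: {distance:.2f} m (variance {variance:.3f} m^2)")
```

In a production stack this role would more likely be played by a tracking filter, such as a Kalman filter, running on the central ECU, but the principle is the same: a processor that sees every sensor's output can weigh them against one another instead of trusting any single pre-processed object list.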
The satellite radar architecture has other benefits as well. Removing compute-intensive local processing from the sensor unit makes radars considerably cheaper and less power-hungry.
Power efficiency is a high priority for OEMs because, with the electrification of the powertrain, the power drawn by the vehicle's electronics directly affects how far it can travel on a charge.
Cost savings matter because radars are frequently mounted in exposed areas of the car that are prone to collision damage, so a cheaper sensor is also cheaper to replace.
A satellite architecture requires high-bandwidth, low-latency tunneling of raw data between the remote sensors and the host processor.
The MIPI A-PHY standard aims to provide exactly this kind of high-speed link.
With it, the industry finally has a streamlined, standardized way to carry high-bandwidth raw data from the radars to the main ECU.
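As a sketch of how a system designer might sanity-check such a link, the snippet below compares the per-radar raw data rate estimated earlier against a few multi-gigabit link speeds. The link speeds and the protocol-overhead figure are assumptions chosen for illustration, not values taken from the A-PHY specification:

```python
# Sketch: checking whether a raw sensor stream fits within a serial-link budget.
# The link speeds below are illustrative placeholders, not quoted A-PHY gears.

def fits(link_bps, stream_bps, overhead=0.20):
    """Return True if a stream fits on the link, assuming 20% protocol overhead."""
    return stream_bps * (1 + overhead) <= link_bps

radar_raw_bps = 2.56e9  # per-radar raw rate from the earlier rough estimate
candidate_links_bps = [2e9, 4e9, 8e9, 16e9]

for link in candidate_links_bps:
    verdict = "fits" if fits(link, radar_raw_bps) else "does not fit"
    print(f"{link / 1e9:.0f} Gbit/s link: one raw radar stream {verdict}")
```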
Enabling the next phases of autonomous driving, and reducing ambiguity about the navigation environment, will require every sensor type, and that is where sensor fusion comes into play. Radar will remain essential in that picture, because centralized processing can significantly improve how multi-sensor systems are implemented today.