In recent years, autonomous driving has been developing rapidly: Waymo's self-driving fleet and the first Audi A8 equipped with LiDAR illustrate how active this field has become. However, today's "self-driving" vehicles fall into two categories: one is the driverless vehicle, which carries a full suite of cameras, lidars, and radars to achieve full autonomy; the other is the passenger vehicle equipped with Advanced Driver Assistance Systems (ADAS), which carries only a limited set of sensors to achieve partial autonomy. According to Yole's recently updated "Lidar for Automotive and Industrial Applications - 2019 Edition" report, the automotive lidar market will grow to 4.2 billion US dollars by 2024.
Driverless vehicles use multiple lidar units to map the surrounding environment. To ensure occupant safety, lidar must be combined with cameras and radar so that a high degree of redundancy is maintained across sensors.
The prospects for Level 3 ADAS, in which the vehicle can monitor the driving environment, remain uncertain, because most OEMs are targeting partial automation. For this level of autonomy, the OEM defines the use cases in which the vehicle is allowed to drive itself. Currently, OEMs usually point to two use cases: traffic jams and/or highway driving. For example, an OEM may choose when autonomous mode can be engaged and may exclude night-time conditions. These choices are mainly cost-driven, and they directly affect the type and number of sensors embedded in ADAS vehicles.
In addition, the performance of conventional sensors is still improving. A combination of cameras, radar, and ultrasonic sensors is cheaper and sufficient to meet the requirements of this level of autonomy, and because its cost remains acceptable to customers, this solution suits most OEM applications. For high-end and luxury ADAS product lines, however, customers are willing to pay for the latest technology as a differentiator, which opens another path for lidar adoption: such a solution can provide more advanced functionality or enable additional autonomous driving scenarios.
An autonomous vehicle needs to perceive its surroundings in order to drive safely. Many sensors can provide this information, including cameras and Global Navigation Satellite System (GNSS) receivers. However, only three sensing modalities provide direct range measurement: ultrasonic sensors for short-range measurement, radar for object detection, and lidar for 3D environmental sensing.
Today these sensors are used in both passenger cars and driverless cars, but their applications differ. Passenger cars have limited self-driving capability, whereas driverless cars, by definition, must rely on sensor redundancy to achieve full autonomy. Because air strongly attenuates ultrasound, the detection range of ultrasonic sensors is limited to a few meters, and they are mainly used for parking assistance. Radar is commonly used in passenger vehicles for adaptive cruise control (ACC) and automatic emergency braking (AEB). Accordingly, Yole's latest report, "Automotive Radar and Wireless Communications - 2019 Edition", predicts that the automotive radar market will exceed $8 billion in 2024.
Driverless vehicles can use 4D imaging radars with large antenna arrays to achieve angular resolution below 1 degree, or dozens of small high-resolution UWB radars. However, current 4D imaging radars are larger than 10 x 10 cm, which makes them difficult to integrate into conventional car designs, while the detection range of UWB radar is limited to tens of meters. Lidar units costing thousands of dollars are unattractive to car manufacturers, but their high performance, such as a 200-meter detection range and 0.1-degree angular resolution, makes them widely favored by driverless car companies.
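To see why these angular-resolution figures matter in practice, note that the lateral (cross-range) footprint of one angular-resolution cell grows linearly with distance. The short sketch below (the helper function is our own illustration, not part of any sensor API) compares the figures quoted above, roughly 1 degree for 4D imaging radar versus 0.1 degree for lidar, at a 200-meter detection range, using the small-angle approximation.

```python
import math

def cross_range_resolution(angular_resolution_deg: float, distance_m: float) -> float:
    """Lateral distance spanned by one angular-resolution cell at a given range.

    Small-angle approximation: lateral ~= distance * angle_in_radians.
    """
    return distance_m * math.radians(angular_resolution_deg)

# Figures quoted in the text: lidar at 0.1 deg, 4D imaging radar at ~1 deg,
# both evaluated at the lidar's 200 m detection range.
lidar_cell = cross_range_resolution(0.1, 200)  # ~0.35 m: narrow enough to separate pedestrians
radar_cell = cross_range_resolution(1.0, 200)  # ~3.49 m: wider than a typical car

print(f"lidar cross-range cell at 200 m: {lidar_cell:.2f} m")
print(f"radar cross-range cell at 200 m: {radar_cell:.2f} m")
```

At 200 meters, a 0.1-degree sensor resolves features about 0.35 m apart, while a 1-degree sensor blurs everything within roughly 3.5 m into one cell, which is one reason driverless car companies accept lidar's cost.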
There is no doubt that these technologies will continue to develop, and one driving factor is integrating them into passenger cars to achieve Level 3 ADAS. In 2019, 3D radars will enter the market, bringing a vertical field of view and Doppler sensing capability. The fusion of 3D radar, cameras, and GNSS sensors may be enough to achieve the lane-level accuracy required for Level 3 ADAS. This scenario could seriously affect the business development of lidar and limit its application in the automotive market.
In addition, the further development of 4D imaging radar is expected to make it suitable for personal vehicles. The first commercial products may reach the market between 2025 and 2030, just in time to catch the market penetration of Level 4 ADAS. For lidar sensors, new technologies such as MEMS optical scanning and photodetector arrays for flash lidar are expected to reduce prices substantially over the same period. Another key factor in lidar price reduction is economies of scale as production volumes grow.
The sensor industry is moving so fast that no one can really predict how Level 4 ADAS and future sensor combinations will ultimately develop. The industry is therefore investing in the technologies that look most promising, and the performance and cost competition between radar and lidar has already begun. Competition drives technology forward, and this space deserves our close attention.