In the early 1990s, cameras were first used in passenger-car rear-view systems. Today, cameras are the backbone of vision-based perception in ADAS and autonomous driving systems.
As ADAS applications proliferate and the technology continues to evolve, high-quality image sensors are placed in and around vehicles to support functions such as rear view, forward view, interior monitoring, and 360° surround view. Automotive camera lenses often have to exceed the visual capabilities of the human eye to achieve the required level of safety. Advances in semiconductor technology have enabled significant improvements in image sensing and processing, broadening the range of ADAS functions.
Automotive cameras serve two primary functions in any vehicle: vision applications, in which the camera displays an image of the vehicle's surroundings to the driver on a screen; and sensing applications, in which the camera provides decision input to the processing system.
ADAS and autonomous driving functions require vision systems that can handle imaging, high-speed communications, and downstream image processing. Automotive vision systems powered by image recognition can effectively extend the driver's vision and warn of potential safety hazards early enough for the driver or driver-assistance systems to respond.
Car camera footage is ideal for vision-based detection of objects such as other vehicles, pedestrians, cyclists, traffic signals, and road signs. Cameras are the only sensing technology that can perceive color and read traffic signals and road signs. Vision technology has matured over the years and is relatively affordable. In addition, the camera module is very compact and can be easily integrated into the vehicle. With advanced image processing, a camera provides highly reliable object detection and classification.
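Whether a camera can actually read a road sign comes down to how many pixels the sign occupies in the image, which a simple pinhole-camera model can estimate. The sketch below is illustrative only: the sign size, distance, resolution, and field-of-view values are hypothetical, and a real wide-angle automotive lens adds distortion that this model ignores.

```python
import math

def pixels_on_target(object_width_m, distance_m, image_width_px, hfov_deg):
    """Pinhole-model estimate of how many horizontal pixels a target spans.

    Illustrative sketch; ignores lens distortion and assumes the target
    is centered and perpendicular to the optical axis.
    """
    # Effective focal length expressed in pixels for this resolution/FoV
    focal_px = (image_width_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    return focal_px * object_width_m / distance_m

# Hypothetical case: a 0.6 m wide road sign at 50 m, seen by a
# 1920-pixel-wide camera with a 100-degree horizontal field of view
px = pixels_on_target(0.6, 50.0, 1920, 100.0)
```

Under these assumed numbers the sign spans only about ten pixels, far too few to read text, which is why sign-reading functions push toward higher-resolution sensors or narrower forward-view lenses.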
However, cameras have their own limitations: they do not work well in low light or in adverse weather and road conditions such as rain, fog, snow, and mud. Pairing them with other sensing technologies, such as radar and lidar, can compensate for these performance limitations.
Generally, a car camera module includes a lens, an image sensor, an image signal processor, and an image recognition platform. Let's look at them one by one, starting with the lens.
Camera optics play a vital role in determining the quality of the images produced. The role of the lens is to focus the image onto the image sensor. Therefore, the lens must be able to illuminate the entire sensor area to avoid shadows or vignetting in the resulting image. In an automotive camera module, the lens is mounted in an optics housing that is attached to the PCB. Lens selection based on the image sensor affects resolution, FoV, depth of field, color reproduction, and the overall sensitivity of the vision system.
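The relationship between the lens and the sensor it must cover can be made concrete with the pinhole-model formula FoV = 2·atan(w / 2f), where w is the sensor's active width and f the focal length. The sketch below uses hypothetical numbers (a roughly 1/2.7-inch-class sensor with a short-focal-length wide-angle lens) and, like any pinhole model, ignores the distortion of real automotive wide-angle optics.

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    # Pinhole model: FoV = 2 * atan(sensor_width / (2 * focal_length))
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Hypothetical pairing: ~5.37 mm active sensor width with a 2 mm lens
fov = horizontal_fov_deg(5.37, 2.0)
```

With these assumed values the horizontal FoV comes out a little over 100 degrees, the kind of wide coverage used for surround-view cameras; a longer focal length on the same sensor would narrow the FoV for forward-looking, long-range functions.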
Prime lenses with wide fields of view are the most popular lens types for automotive applications. Choosing the most suitable lens is directly related to the image sensor used in the camera. This affects the achievable capture speed and thus the measurement accuracy and reliability of downstream analysis of captured images.