

*Driver Assistance Technologies*

*DOI: http://dx.doi.org/10.5772/intechopen.94354*


| Sensor | Data rate required to transmit raw data |
|--------|-----------------------------------------|
| Camera | 1 Gb/s to 24 Gb/s |
| Radar  | 5 Gb/s to 120 Gb/s |
| Lidar  | 2 Mb/s to 10 Gb/s |

**Table 1.**
*Data rate required for transmission of data.*
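As a rough sanity check on the camera figures in **Table 1**, the raw data rate of an uncompressed video stream follows directly from resolution, bit depth, and frame rate. The resolutions and frame rates below are illustrative assumptions, not values from the text:

```python
# Back-of-envelope raw data rate for an automotive camera:
# rate (bits/s) = width * height * bits_per_pixel * frames_per_second

def raw_camera_rate_gbps(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) video data rate in Gb/s."""
    return width * height * bits_per_pixel * fps / 1e9

# A 1080p sensor at 30 fps with 24-bit pixels: ~1.5 Gb/s
low = raw_camera_rate_gbps(1920, 1080, 24, 30)

# An 8 MP sensor at 60 fps with 24-bit pixels: ~11.9 Gb/s
high = raw_camera_rate_gbps(3840, 2160, 24, 60)

print(f"{low:.1f} Gb/s, {high:.1f} Gb/s")
```

Both illustrative values fall inside the 1 Gb/s to 24 Gb/s camera range quoted in **Table 1**.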


*Models and Technologies for Smart, Sustainable and Safe Transportation Systems*

Fusion of data received from complementary and independent sources combines the data into a single description of the environment. Data association and data assimilation are the two key components to be addressed for data fusion: the process matches sensor data with the description of the environment, which requires synchronization of the sensor data with the associated object state (e.g., position and velocity).

**Figure 6.**
*Fusion of data at ECU received from various types of sensors housed in ADAS. Source: Ref. [3].*

It is extremely important to know which sensors are required for autonomous driving from Levels 1 to 5. As already mentioned, there are three main groups of sensor systems: camera-, radar-, and LIDAR-based systems. Although ultrasonic sensors are widespread today for parking, they are of minor importance for autonomous driving. Camera and radar systems are in Level 1 and 2 vehicles today and are a prerequisite for all further levels of automation.

**5. Various sensors of ADAS**

**5.1 Sensor camera for ADAS**

This advanced camera (a digital HDR CMOS camera) with a large dynamic range is well suited to poor light conditions and to scenes with large differences in brightness.

A large number of digital interfaces are available for automotive cameras, along with a digital signal processor and internal memory. The camera generates processed video images for evaluation by software algorithms. It also transforms images into signals that can be merged with other sensor signals, such as those from radar and LIDAR. Due to the inherent intelligence of the camera, all the signals are processed in fusion mode to enable the ADAS to take the correct decision. A camera used as a sensor [9] is required to pass the automobile industry's quality management standard (ISO/TS 16949) and must be suited to quick and flexible adaptation. A current digital camera system continuously receives raw data that is then processed and forwarded to the display unit for image display. This procedure is shown in **Figure 7**.
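The continuous capture-process-display procedure just described can be sketched as a simple generator pipeline. All function names and the toy processing step are hypothetical stand-ins for the camera's DSP stages:

```python
# Hypothetical sketch of the camera pipeline described above:
# raw frames are continuously captured, processed, and forwarded
# to the display unit. Names and processing steps are illustrative.

def process_frame(raw):
    """Stand-in for the DSP stage: e.g. demosaic, HDR tone-map, denoise."""
    return [min(255, p * 2) for p in raw]   # toy brightness adjustment

def camera_pipeline(raw_frames):
    """Continuously process raw frames and yield display-ready images."""
    for raw in raw_frames:
        yield process_frame(raw)

# Usage: three dummy 4-pixel "frames"
frames = [[10, 20, 30, 200], [0, 50, 100, 128], [255, 1, 2, 3]]
for img in camera_pipeline(frames):
    print(img)
```

A generator keeps the sketch streaming-friendly: frames are processed one at a time as they arrive, matching the continuous raw-data flow the text describes.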

**Figure 7.**
*Video data transfer to head unit of camera through Ethernet. Source: https://www.fierceelectronics.com/components/three-sensor-types-drive-autonomous-vehicles*

Besides this, the infrared (IR) camera consists of several components. It is important to distinguish two different versions of the IR camera:


1. Near InfraRed (NIR);

2. Far InfraRed (FIR);
In both systems the camera plays an important role in identifying the radiation of objects. NIR technology provides extra illumination through IR headlights, while FIR systems use no special headlights. The primary difference between the two is that NIR systems pick up the additionally illuminated objects, while FIR accepts only the natural radiation of objects.

**Table 1** presents the data rates required to transmit raw data from the sensors [10]. **Figure 8** shows the functioning of LIDAR.

**5.2 LIDAR systems**

For the purpose of measuring distance and creating three-dimensional images of the environment, LIDAR systems [11] are fitted and integrated ever more frequently into vehicles and mobile machines. A pulsed laser beam is emitted, and the signal's transit time from the object back to the detector is measured, as shown in **Figure 8**.

**Figure 8.**
*Principle of the functioning of LIDAR. Source: https://www.fierceelectronics.com/components/three-sensor-types-drive-autonomous-vehicles.*

A highly sensitive detection technique using avalanche photodiodes (APDs) with internal amplification measures the light pulses in the nanosecond range across wide bandwidths. The LIDAR optical system requires high spatial resolution; the sensor therefore uses APD arrays comprising multiple sensor elements. The APD arrays compensate for the effect of temperature on their high operating voltage, and their highly accurate amplification offers excellent APD signal quality. The modules can be adapted to the specific application, and development boards with a digital output signal are interfaced via Low Voltage Differential Signaling (LVDS). With the help of LIDAR and radar systems, objects on the road can easily be identified, but in addition a camera is necessary for correct classification and detection of an object. From the point-density cloud developed from radar and LIDAR reflections, the distance and closing speed of an object can easily be measured. However, due to the lower resolution of these sensors compared to a camera, objects are not as easily detected. To optimize detection at varying ranges with the lower resolution, a number of units are installed, from a medium-range unit for emergency brake assist to long-range radar for adaptive cruise control, although LIDAR and radar function in a similar way at longer ranges with lower point density.
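The transit-time measurement underlying LIDAR reduces to a one-line formula: the pulse travels to the object and back, so the distance is the speed of light times half the round-trip time. A minimal sketch (the 667 ns example value is illustrative):

```python
# LIDAR time-of-flight: a pulse travels to the object and back,
# so distance = speed_of_light * transit_time / 2.

C = 299_792_458.0  # speed of light in m/s

def lidar_distance_m(transit_time_ns):
    """Distance to the reflecting object, given round-trip time in ns."""
    return C * (transit_time_ns * 1e-9) / 2

# A target ~100 m away returns the pulse after roughly 667 ns:
print(f"{lidar_distance_m(667):.1f} m")
```

The nanosecond timescale here is why the text stresses detectors that resolve light pulses in the nanosecond range: a 15 cm range error corresponds to only about 1 ns of timing error.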
