**6. Understanding the design of ADAS**

To make ADAS commercially viable, three aspects, namely design, testing and validation, are of great importance, and each poses a challenge to researchers, scientists and manufacturers. Processing and sharing information within the fusion system in real time demands a huge computational effort; it is a complex and difficult task in view of the computational load and the time constraints placed on the system.

An inertial navigation system measures position, orientation and velocity. The RT-Range sensor [13] creates a real-time network capable of tracking multiple targets and calculating distance, time to collision and other relative measurements. Targets primarily include road vehicles and vulnerable road users (VRUs) such as cyclists and pedestrians, as well as Euro NCAP traffic targets and more. Euro NCAP (the **European New Car Assessment Programme**) is a European car safety performance assessment programme. Data are captured and made available in real time on a software dashboard to verify test outcomes, and vehicle-to-vehicle measurements can be made over a 1 km range. Many similar systems, each with a slightly different name, are increasingly seen in different parts of the world.
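The kind of relative measurement such a system reports, for example time to collision, can be illustrated with a short kinematic sketch. This is a minimal illustration under a constant-closing-speed assumption, not the RT-Range algorithm; the function and parameter names are my own, and a real system would derive the gap and closing speed from fused GNSS/inertial data.

```python
# Sketch of a relative measurement such as time-to-collision (TTC).
# Assumes a straight-line, constant closing speed between host and target;
# names and values are illustrative, not taken from the RT-Range system.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until the gap closes; infinity if the target is not approaching."""
    if closing_speed_mps <= 0.0:  # gap is opening or constant: no collision course
        return float("inf")
    return gap_m / closing_speed_mps

# Host at 25 m/s approaching a target at 15 m/s, 40 m ahead:
ttc = time_to_collision(gap_m=40.0, closing_speed_mps=25.0 - 15.0)
print(round(ttc, 1))  # 4.0 seconds
```

A fusion system would recompute such quantities at every cycle for every tracked target, which is where the real-time computational load discussed above arises.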

The various ADAS subsystems and their associated sensors are presented in **Table 4**. A number of sensors developed in the course of ADAS development are briefly discussed below.

#### **Table 4.**

*Various sensors related to their applications. Source: Automotive ADAS Systems, ST Developers Conference, Sep 12, 2019, Santa Clara Convention Centre, Mission City Ballroom, Santa Clara, CA.*


*DOI: http://dx.doi.org/10.5772/intechopen.94354*

**6.1 Night vision**

At night, safe driving depends above all on proper visibility, and here the camera plays the key role. Night vision cameras therefore use near- or far-infrared sensing to improve the driver's perception in dark conditions, and the enhanced image is shown on the vehicle's monitor. The human-machine interface nevertheless poses a challenge: it must present the road scene correctly and in time for the driver to intervene, enhancing safety without distracting the driver. **Table 5** presents the available sensors with their properties for night vision.

**Table 5.**

| Sensor | Property | Comment |
| --- | --- | --- |
| Infrared camera (CMOS) | λ = 800 nm | **N**ear **I**nfra**R**ed (NIR) |
| Infrared camera | λ = 7–14 μm | **F**ar **I**nfra**R**ed (FIR) |

*Driver assistance technologies: available sensors and their properties in night vision systems. Source: [12].*

**6.2 Lane departure warning**

A lane departure warning system works on thresholds defined over quantities such as lateral distance and time to lane crossing. The decision to warn the driver of an unintended lane departure is made from data-fusion analysis supported by software algorithms. For example, acoustic and optical sensors continuously generate data that are analysed together with the video image processing output of the vehicle's cameras, and this combination yields the warning. For the warning system to be effective, the carriageway must be laid out with clearly visible lane markings; their quality influences the complexity demanded of the roadside infrastructure. The system aims to prevent involuntary lane departure, which constitutes a relevant cause of road accidents. With real-time measurement and positional accuracy generally better than 2 cm, the system captures the data from which the sensor detects a lane departure, as shown in **Figure 11**; the lane departure warning system is triggered if the vehicle suddenly changes lane without proper indication. The camera used for lane detection is a low-cost unit generally mounted on the windscreen near the rear-view mirror; this position lets it continuously capture images of the solid lane markings ahead of the vehicle. It also works together with (i) front sensors (adaptive cruise control and forward collision warning), (ii) side sensors (lane departure warning), and (iii) rear sensors (blind spot detection).
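The threshold logic described above can be sketched in a few lines. This is a minimal illustration, not a production algorithm: the variable names, the 1-second time-to-lane-crossing threshold, and the assumption that the fusion stage already supplies the lateral distance and drift velocity are all mine.

```python
# Sketch of a lane-departure-warning decision based on time to lane crossing.
# Assumes the data-fusion stage already provides the lateral distance to the
# lane marking and the lateral drift speed; the threshold is illustrative.

TLC_THRESHOLD_S = 1.0  # warn if the line would be crossed within 1 s (assumed)

def should_warn(dist_to_line_m: float, lateral_speed_mps: float,
                indicator_on: bool) -> bool:
    """Warn on an imminent, un-signalled lane crossing."""
    if indicator_on or lateral_speed_mps <= 0.0:  # intentional, or no drift toward line
        return False
    tlc = dist_to_line_m / lateral_speed_mps      # seconds until the line is crossed
    return tlc < TLC_THRESHOLD_S

print(should_warn(0.3, 0.5, indicator_on=False))  # True: crossing in 0.6 s
print(should_warn(0.3, 0.5, indicator_on=True))   # False: driver signalled the change
```

Suppressing the warning when the indicator is active reflects the distinction the text draws between an intentional lane change and an involuntary departure.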

**6.3 Near field collision warning**

Multiple collision warning systems [12] are listed in **Table 6**. The finest example of near field collision warning is blind spot detection, which senses the presence of a vehicle in very close proximity. Lidar, radar or vision-based sensors are generally used, and the warning itself may be acoustic, haptic or optical. In many cases this kind of sensor operates at 24 GHz. To test and develop blind-spot detection systems, it is necessary to
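The blind-spot check itself reduces to testing whether any detection falls inside a monitored zone beside the host vehicle. The sketch below assumes a radar that reports detections as (x, y) positions in the host frame; the zone boundaries are illustrative assumptions, not standardised values.

```python
# Sketch of a near-field blind-spot check. Assumes a sensor (e.g. a 24 GHz
# radar) reporting detections as (x, y) metres in the host frame, x forward
# and y left. Zone limits below are assumed purely for illustration.

from typing import List, Tuple

ZONE_X = (-4.0, 1.0)  # from 4 m behind the host to 1 m ahead (assumed)
ZONE_Y = (1.0, 4.0)   # adjacent lane on the left side (assumed)

def in_blind_spot(detections: List[Tuple[float, float]]) -> bool:
    """True if any detection falls inside the monitored blind-spot zone."""
    return any(ZONE_X[0] <= x <= ZONE_X[1] and ZONE_Y[0] <= y <= ZONE_Y[1]
               for x, y in detections)

print(in_blind_spot([(-2.0, 2.5)]))  # True: vehicle alongside in adjacent lane
print(in_blind_spot([(10.0, 2.5)]))  # False: well ahead, visible to the driver
```

A real system would run such a check per cycle for each side of the vehicle and drive an acoustic, haptic or optical warning from the result.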

Both systems are mono-camera,
