**2. Architecture of ADAS**

A number of sensors are increasingly being used, namely cameras, medium- and long-range radar, ultrasonic sensors, and LIDAR. Data generated from these sensors go through a fusion process that cross-validates them, enabling the computer software to perform the necessary tasks and activate the driver assistance system to make correct decisions. These decisions relate to parking assistance, automatic emergency braking, pedestrian detection, surround view, and even driver drowsiness. The functional components, such as the various types of sensors collecting data from the immediate surrounding environment, form part of the ADAS architecture that performs the necessary tasks, as shown in **Figure 1**. The forward collision-avoidance ECU module is located in the windshield, supported by blind-spot ultrasonic sensors, and the related ADAS processor may be located in the side mirrors or elsewhere.
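The fusion step described above can be sketched in a few lines. This is only a toy illustration: the `Detection` class, the confidence values, and the two-sensor agreement rule are invented here for clarity, not taken from any production ADAS stack.

```python
# Toy sensor-fusion sketch: cross-checking detections from several ADAS
# sensors before triggering automatic emergency braking (AEB).

from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar", "ultrasonic", "lidar"
    obstacle: bool     # sensor reports an obstacle ahead
    confidence: float  # 0.0 .. 1.0

def fuse(detections, threshold=0.7):
    """Agreement-weighted fusion: average the confidence of sensors that
    report an obstacle, and require at least two agreeing sensors so a
    single noisy sensor cannot trigger the brakes on its own."""
    agreeing = [d for d in detections if d.obstacle]
    if len(agreeing) < 2:
        return False
    score = sum(d.confidence for d in agreeing) / len(agreeing)
    return score >= threshold

readings = [
    Detection("camera", True, 0.9),
    Detection("radar", True, 0.8),
    Detection("ultrasonic", False, 0.3),
]
print(fuse(readings))  # True -> camera and radar agree, activate AEB
```

Real fusion pipelines operate on tracked object lists and occupancy grids rather than booleans, but the principle is the same: no single sensor's report is trusted in isolation.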

The **architecture** [2–4] of the electronic control units (ECUs) responsible for executing advanced driver assistance systems (**ADAS**) in vehicles is evolving to respond to the demands of the driving process. Automotive system architects integrate multiple applications into **ADAS** ECUs that serve multiple functions of the ITS architecture, as shown in **Figure 2**. **Figures 3** and **4** show the architectures of two further functions, forward collision avoidance and parking assistance, respectively.
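The idea of one ECU serving multiple ADAS functions can be sketched as a simple dispatcher. The `AdasEcu` class and handler names below are hypothetical, not a real AUTOSAR or supplier API; the sketch only shows how one incoming sensor frame fans out to every function registered for that sensor type.

```python
# Illustrative sketch: one ADAS ECU hosting several assistance functions
# and dispatching each incoming sensor frame to every function that
# subscribes to that sensor type.

class AdasEcu:
    def __init__(self):
        self._functions = {}  # sensor type -> list of handler callables

    def register(self, sensor_type, handler):
        """Install an ADAS function for frames of the given sensor type."""
        self._functions.setdefault(sensor_type, []).append(handler)

    def on_frame(self, sensor_type, frame):
        """Run every registered handler on the frame; collect their outputs."""
        return [h(frame) for h in self._functions.get(sensor_type, [])]

ecu = AdasEcu()
# Forward collision warning and adaptive cruise both consume radar frames:
ecu.register("radar", lambda f: f"FCW: range {f['range_m']} m")
ecu.register("radar", lambda f: f"ACC: closing speed {f['closing_mps']} m/s")
# Parking assistance consumes ultrasonic frames:
ecu.register("ultrasonic", lambda f: f"Parking: obstacle at {f['distance_cm']} cm")

print(ecu.on_frame("radar", {"range_m": 42.0, "closing_mps": 3.5}))
```

Consolidating functions this way is what drives the shift from one-box-per-feature ECUs toward the integrated ADAS domain controllers discussed below.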

The hardware architecture of ADAS and autonomous driving includes automotive Ethernet, time-sensitive networking (TSN), Ethernet switches and gateways, and the domain controller, while the software architecture includes AUTOSAR Classic and Adaptive, ROS 2.0, and QNX.
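One job of the Ethernet gateway in this hardware architecture is to forward safety-critical traffic ahead of everything else. The toy model below only illustrates strict-priority egress with a heap; real TSN (IEEE 802.1Q) adds time-aware shaping, frame preemption, and more, and the `Gateway` class here is an invented name, not an actual automotive API.

```python
# Toy model of an in-vehicle Ethernet gateway: frames carry a TSN-style
# priority (lower number = more urgent) and the gateway always forwards
# the most urgent queued frame first.

import heapq
import itertools

class Gateway:
    def __init__(self):
        self._queue = []
        self._seq = itertools.count()  # FIFO tie-break within one priority

    def enqueue(self, priority, frame):
        heapq.heappush(self._queue, (priority, next(self._seq), frame))

    def forward(self):
        """Pop and return the highest-priority (lowest-numbered) frame."""
        return heapq.heappop(self._queue)[2]

gw = Gateway()
gw.enqueue(5, "infotainment frame")
gw.enqueue(0, "brake-command frame")
gw.enqueue(2, "camera frame")
print(gw.forward())  # brake-command frame: forwarded first despite arriving second
```

The sequence counter matters: without it, two frames of equal priority would be compared by their payloads, whereas here they leave in arrival order.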

*Driver Assistance Technologies. DOI: http://dx.doi.org/10.5772/intechopen.94354*

*Models and Technologies for Smart, Sustainable and Safe Transportation Systems*

During the gradual emergence of connected and automated vehicles (CAVs), driver behavior modeling (DBM), coupled with simulation system modeling, appears to be instrumental in predicting driving maneuvers, driver intent, vehicle and driver state, and environmental factors, improving transportation safety and the driving experience as a whole. These models can play an effective role by incorporating their safety-proof output into advanced driver assistance systems (ADAS). To cite an example, the information generated from all types of sensors in an ADAS-driven vehicle, combined with accurate lane-changing prediction models, could prevent road accidents by alerting the driver to potential danger ahead of time. It is increasingly felt that DBM, developed by incorporating personal driving incentives and preferences along with contextual factors such as weather and lighting, still needs to be refined, calibrated, and validated to make it robust, so that it yields both better personalized and generic models. With regard to the modeling of personalized navigation and travel systems, earlier studies have mainly assumed ideal knowledge of the road network and environment, which does not seem very realistic. More research is required to address these real-life challenges and make ADAS more acceptable to society. There is increasing evidence in the literature that DBM mostly focuses on a single vehicle making inferences based on sensed measurements of the driver, the vehicle, and its environment, while hardly any attempt has been made to develop DBM for a traffic environment with vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication systems. It would be interesting to develop DBM for CAVs that leverages information from multiple vehicles, so that more global behavioral models can be developed. The output of such CAV modeling would be useful in the design of ADAS-driven vehicles to create a safety-proof driving scenario for diverse applications.
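The lane-change prediction idea mentioned in the DBM discussion can be made concrete with a minimal sketch. The feature names, weights, and bias below are entirely invented and uncalibrated; a real DBM would learn them from driving data. The sketch only shows the shape of such a model: a logistic score over a few driver and vehicle signals, thresholded into an alert.

```python
# Hypothetical lane-change-intent scorer: a hand-weighted logistic model
# over a few driver/vehicle signals. Weights are illustrative, not learned.

import math

WEIGHTS = {
    "turn_signal_on": 2.5,      # strongest cue for intended lane change
    "lateral_drift_mps": 1.8,   # drift rate toward the lane boundary
    "gap_ahead_closing": 1.2,   # a slower lead vehicle encourages a change
}
BIAS = -3.0  # keeps the baseline probability low with no cues present

def lane_change_probability(features):
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) function

def should_alert(features, threshold=0.5):
    """Warn the driver/nearby systems when predicted intent is high."""
    return lane_change_probability(features) >= threshold

print(should_alert({"turn_signal_on": 1.0, "lateral_drift_mps": 0.6}))  # True
```

In a V2V/V2I setting, the same score could be broadcast so that surrounding vehicles, not just the ego vehicle, can react to a predicted maneuver.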

**Figure 1.**
*Functional components and various types of sensors. Source: http://www.hitachi-automotive.us/Products/oem/DCS/ADAS/index.htm*

**Figure 2.**
*Architecture of ADAS. Source: Ref. [3].*

**Figure 3.**
*Architecture of forward collision avoidance & blind-spot avoidance. Source: Ian Riches, Strategy Analytics.*

**Figure 4.**
*Architecture of ADAS - parking assistance & blind spot. Source: http://www.techdesignforums.com/practice/technique/managing-the-evolving-architecture-of-integrated-adas-controllers/*
