**2. Sensors analysis and vehicle classification**

In recent years, electronic systems have assumed a growing importance in the development of assisted and automated driving systems. In particular, research has focused on two main topics:


This work starts from the evolution of sensors to give an overview of the different techniques and sensor families used in today's autonomous vehicles, in order to highlight the issues related to their development and to provide ideas for integrating them better into traffic models, thus obtaining more accurate models and better results.

First of all, four types of sensors and related systems used in today's smart vehicles can be identified, each discussed in the following subsections:

- image sensors and processors;
- radar sensors;
- lidar sensors;
- ultrasonic sensors.


Moreover, the data coming from the different sensors have to be considered as a whole, both to obtain a redundant and robust system and to arbitrate between contrasting detections from different sensors. This exchange of information between the different subsystems takes place over an internal network, such as a CAN bus, which also avoids duplicating several sensors of the same type. Ideally, the aim of the data flowing to and from these control units is to describe the status of the car and to formulate the actions to be taken at any time. It is interesting to note that new smart vehicles must also process data coming from other vehicles, causing a huge increase in the amount of data to be handled and, thus, driving research into fast computing, stream processing systems and diagnostic protocols [2].
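The arbitration between contrasting detections described above can be sketched as a confidence-weighted vote. The data structure and thresholds below are illustrative assumptions, not a description of any production fusion system:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str      # e.g. "camera", "radar", "lidar"
    obstacle: bool   # did this sensor report an obstacle?
    confidence: float

def fuse(detections):
    """Confidence-weighted vote: returns True when the fused
    evidence supports the presence of an obstacle."""
    score = sum(d.confidence if d.obstacle else -d.confidence
                for d in detections)
    return score > 0.0

# Two sensors agree, one dissents: the fused result keeps the obstacle
readings = [
    Detection("camera", True, 0.9),
    Detection("radar", True, 0.7),
    Detection("lidar", False, 0.4),  # contrasting detection
]
print(fuse(readings))  # True
```

A real system would additionally weight each sensor by the current operating conditions (e.g. lowering the camera's weight in fog), which is exactly why the cross-technology redundancy discussed here matters.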

A careful examination of the various techniques in use is therefore needed, together with an evaluation of the differences between the sensor types and of how and when to use each of them, in order both to better understand their behavior and to develop new models.

#### **2.1 Image sensors and processors**

The use of image sensors in the automotive field goes back to the vision-guided auto-parking systems that have been around for the past twenty years [3]. Some of the main techniques are still used nowadays, but the main topic has shifted from parking problems to active detection during driving. Several studies have been conducted on these applications, e.g. dedicated to pedestrian crossing and traffic signal recognition. In this type of problem the image sensor has to acquire images from the environment at resolutions and acquisition rates suitable for the detection task.

Together with the sensor, the image processing methods also have to be considered. In particular, to raise the number of frames per second (fps) that can be processed, hardware accelerators closely coupled to the image sensor are used. The accelerators are mainly dedicated to filtering operations on the incoming images, so that only the data significant for the detection task are sent to the processing units. Several such systems have been developed, for example for edge detection and recognition, segmentation and object recognition. However, the trend is shifting toward artificial neural networks capable of calibrating their detection parameters according to the incoming stimuli and their previous history. Neural network hardware accelerators have been developed in recent years to make these systems capable of operating in real time. While the training of the neural networks to determine the optimal parameters is usually done offline, some parameters can also be varied on-the-fly, making the system resilient to false detections or improving the driving performance.
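The kind of filtering operation typically offloaded to such accelerators can be illustrated with a Sobel edge-detection sketch. This pure-Python version is for illustration only; a hardware accelerator would operate on streaming pixel data rather than nested lists:

```python
# 3x3 Sobel kernels for horizontal and vertical gradients
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    """Approximate gradient magnitude of a grayscale image
    (list of rows); image borders are skipped for simplicity."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge: the filter responds strongly along the boundary
img = [[0, 0, 255, 255]] * 4
edges = sobel_magnitude(img)
```

Only pixels with a large gradient magnitude would be forwarded to the downstream detection stage, which is what keeps the data volume sent to the processing units manageable.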

The challenge nowadays is to develop faster neural network systems operating together with the image sensors, in order to process larger amounts of data at higher frame resolutions; on the other hand, new dedicated algorithms could also reduce the complexity of the problem, which in turn would yield lower latencies and better detection performance, in terms of both correctness and speed.
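A back-of-the-envelope calculation shows why higher resolutions and frame rates compound. The resolutions and rates below are generic illustrative figures, not tied to any specific automotive camera:

```python
def pixel_rate(width, height, fps):
    """Pixels per second an image pipeline must sustain."""
    return width * height * fps

# Moving from 720p @ 30 fps to 1080p @ 60 fps multiplies the load by 4.5
load_720p30 = pixel_rate(1280, 720, 30)    # 27,648,000 pixels/s
load_1080p60 = pixel_rate(1920, 1080, 60)  # 124,416,000 pixels/s
print(load_1080p60 / load_720p30)          # 4.5
```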

#### **2.2 Radar sensors**

Radar systems are used in automotive applications to help solve problems like auto-parking, pre-crash sensing and adaptive cruise control, since one of the main characteristics of a radar system is its capability of inferring the velocity of the detected objects in its range. Radar systems used in automotive applications usually operate at a frequency of 77 GHz. The measurement time can be reduced to 10 ms, allowing the system to scan the scene 100 times per second [4].

Radar systems are used together with other systems and prove particularly effective for detections at distances up to approximately 200 m, with a range resolution of 1 m [4]. These peculiarities make radar particularly appealing for mid-range applications in conditions of non-optimal visibility, which can cause major problems for systems based on image sensors. However, radar systems show poor object detection performance at medium ranges (30 m to 60 m) for angles larger than ±π/6; for that reason, they need to be supported by other detection systems based on different technologies [5]. Moreover, it has to be highlighted that a typical radar system has difficulties in handling targets at different azimuth values, so more sophisticated data processing systems, such as tracking systems, are needed. These systems can run into problems when the modeling assumptions for the targets are wrong, causing detection problems such as target loss.
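The velocity information mentioned above comes from the Doppler shift of the reflected signal. The sketch below applies the textbook relation f_d = 2v/λ at the 77 GHz carrier; it is a first-order illustration, not a description of any product's processing chain:

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_shift(rel_velocity_mps, carrier_hz=77e9):
    """Doppler frequency shift for a target closing at the given
    relative velocity: f_d = 2 * v / wavelength."""
    wavelength = C / carrier_hz  # ~3.9 mm at 77 GHz
    return 2.0 * rel_velocity_mps / wavelength

# A target closing at 30 m/s (~108 km/h) at 77 GHz
fd = doppler_shift(30.0)  # ~15.4 kHz shift
```

The millimeter-scale wavelength at 77 GHz is what makes these shifts large enough to resolve vehicle-speed differences within a 10 ms measurement window.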

#### **2.3 Lidar sensors**

Lidar systems achieve good performance in the presence of adverse environmental conditions, such as fog, heavy rain or snow, but also in scenes having low illumination levels.

However, lidar systems also show some drawbacks, since in some cases the dynamic range of the lidar system can be exceeded. When this happens, two extremely severe types of error occur: the loss of targets and the generation of ghost targets, which worsen the detection performance unacceptably in some cases [5].
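A lidar computes range from the round-trip time of a laser pulse, and the dynamic-range problem above can be sketched as an amplitude gate on the returns: echoes too weak risk target loss, while saturated echoes can produce ghost targets. The threshold values here are purely illustrative assumptions, not taken from any datasheet:

```python
C = 299_792_458.0  # speed of light, m/s

def lidar_range(echo_delay_s):
    """Time-of-flight range: the pulse travels out and back."""
    return C * echo_delay_s / 2.0

def filter_returns(returns, min_amp=0.05, max_amp=0.95):
    """Keep only echoes inside the usable dynamic range: too weak
    (target lost in noise) or saturated (possible ghost target)
    returns are discarded. Thresholds are illustrative."""
    return [(t, a) for t, a in returns if min_amp <= a <= max_amp]

# (round-trip delay in seconds, normalized amplitude)
returns = [(400e-9, 0.60),   # valid return, ~60 m
           (1.2e-6, 0.02),   # too weak: would be a lost target
           (300e-9, 0.99)]   # saturated: possible ghost target
valid = filter_returns(returns)
distances = [round(lidar_range(t), 2) for t, a in valid]
```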

#### **2.4 Ultrasonic sensors**

Ultrasonic systems are mainly used for parking assist systems, where they can ensure low cost and good performance. Several threshold systems are designed to achieve the correct detection of corners and edges, while other systems take care of the steering angle and wheel speed; everything is coordinated over a network, like the CAN bus previously mentioned [6]. Moreover, these systems have been demonstrated to work well in cooperation with laser parking systems, or even to substitute them [7].
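The ranging principle is the same time-of-flight idea, but at the speed of sound, and the threshold systems mentioned above reduce to simple distance comparisons. The 0.5 m alert threshold below is an illustrative assumption:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def ultrasonic_distance(echo_time_s):
    """Obstacle distance from the round-trip echo time."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

def parking_alert(distance_m, threshold_m=0.5):
    """Threshold check of the kind used in parking assistance
    (the threshold value is illustrative)."""
    return distance_m < threshold_m

d = ultrasonic_distance(0.002)  # 2 ms echo -> 0.343 m
print(parking_alert(d))         # True: obstacle inside 0.5 m
```

The short usable range implied by millisecond-scale echoes is why ultrasonic sensors suit parking maneuvers rather than driving-speed detection.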

Finally, it has to be highlighted that great importance must be given to the decision algorithms, which are usually implemented in software and which can represent the bottleneck in some systems. In what follows we concentrate on the effects related to the sensors, assuming that the performance of the software is the same for all the considered systems, so as not to undermine the comparison results.

#### **2.5 Classification issues**

The scheme internationally used as the standard for the classification of automated vehicles is the "SAE Levels" taxonomy [8], written by SAE International, a U.S.-based, globally active professional association and standards developing organization for engineering professionals in various industries (**Figure 1**). As shown in **Figure 1**, a six-level taxonomy is proposed; these six levels are grouped in a meta-classification based on two macro categories (Human Driver and Automated Driving System) that identify the technology inside the vehicle and the type of driver.
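The two-way meta-classification can be made concrete as a lookup over the six SAE J3016 levels; the level names and the human/system split at level 3 follow the published standard:

```python
# SAE J3016 levels: (name, macro category responsible for
# monitoring the driving environment)
SAE_LEVELS = {
    0: ("No Automation", "Human Driver"),
    1: ("Driver Assistance", "Human Driver"),
    2: ("Partial Automation", "Human Driver"),
    3: ("Conditional Automation", "Automated Driving System"),
    4: ("High Automation", "Automated Driving System"),
    5: ("Full Automation", "Automated Driving System"),
}

def macro_category(level):
    """Map an SAE level to the macro category used in the
    two-way meta-classification described above."""
    return SAE_LEVELS[level][1]
```

The boundary between levels 2 and 3 is the crucial one: it marks the point where responsibility for monitoring the environment passes from the human driver to the automated driving system.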


*Advanced Vehicles: Challenges for Transportation Systems Engineering*
*DOI: http://dx.doi.org/10.5772/intechopen.94105*
