**1. Introduction**

Among the most recent applications of LiDAR technology are transport, surveillance, and security [1]. A pulsed laser is used to measure ranges (variable distances). These distance measurements make it possible to perform 3D mapping using different approaches [2] and to recover the geometry of the scene, not just a projection as in a conventional camera.
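The range measurement rests on the standard time-of-flight relation: the distance to a target is half the round-trip path travelled by the pulse at the speed of light. A minimal sketch (the 200 ns echo time in the example is illustrative, not taken from the text):

```python
# Time-of-flight ranging used by pulsed LiDAR:
# the range d is half the round-trip distance travelled at the speed of light.
C = 299_792_458.0  # speed of light in vacuum [m/s]

def tof_range(round_trip_time_s: float) -> float:
    """Range [m] from the measured round-trip time of a laser pulse."""
    return C * round_trip_time_s / 2.0

# An echo received 200 ns after emission corresponds to roughly 30 m.
print(tof_range(200e-9))  # ~29.98 m
```

In practice the timing resolution of the detector directly limits the range resolution, which is one reason scattering-induced pulse distortion (discussed below) degrades the point cloud.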

LiDAR technology has been presented as a disruptive technology for computer vision, as it provides precise, real-time visualization of the surrounding area and the distribution of objects within it, with performance that meets the required specifications [3]. However, the feasibility of the instrument for day-to-day outdoor use still faces numerous questions, one of them related to its detection breakdown in adverse weather conditions [4–6].

The propagation of light through scattering media such as fog, rain, smoke, or dust presents two main problems: attenuation of the pulse and saturation of the sensor [7–9]. The first is a result of the scattering and absorption properties of the medium, which cause the pulse to lose energy as it propagates. The second is related to the backscattering that light undergoes upon entering the medium, which may blind the sensor [10]. Both phenomena can be readily studied with models if the optical properties of the medium are known, and the results of such models agree with those obtained in experimental tests.
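The attenuation part of the problem is commonly approximated with the Beer-Lambert law: the power surviving a path through the medium decays exponentially with the extinction coefficient times the distance, and for a LiDAR echo the path is traversed twice. A minimal sketch, assuming a homogeneous medium with a known extinction coefficient (the value 0.03 m⁻¹ below is an illustrative assumption, not a figure from the text):

```python
import math

def attenuated_fraction(extinction_m1: float, range_m: float) -> float:
    """Fraction of pulse power surviving a two-way path (Beer-Lambert law).

    extinction_m1 -- extinction coefficient of the medium [1/m] (assumed known)
    range_m       -- one-way distance to the target [m]
    The exponent uses 2 * range_m because the echo traverses the medium twice.
    """
    return math.exp(-2.0 * extinction_m1 * range_m)

# Illustrative fog with extinction ~0.03 1/m, target at 50 m:
print(attenuated_fraction(0.03, 50.0))  # exp(-3), about 5% of the power
```

This simple exponential captures only the attenuation side; the backscattered return near the sensor requires range-resolved scattering models of the kind reviewed later in the chapter.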

At this point, it is worth showing in **Figure 1** a point cloud obtained under fog conditions. The corresponding RGB image of the scene without fog is shown along with three different views of the point cloud (with fog). Objects in the scene are indicated on the RGB image with red letters, and the point clouds are labeled accordingly, i.e., the same letters are used inside black tags on the point clouds to facilitate their interpretation. Z is the direction of propagation, normal to the RGB image and along the tunnel forming the fog chamber; Y refers to the height of the chamber and X to its width; distances are given in meters [m]. On the left, we present a 3D view with an orientation chosen to highlight the appearance of the point cloud. On top, there is a YZ view, i.e., a side view of the scene; on the right, there is a ZX view, a top view of the scene. These views make it easy to notice the spread of the point cloud around a given object/distance, and the points appearing due to the backscattering of fog, especially just in front of the sensor. In conclusion, the point cloud is rather noisy and the range is limited.

**Figure 1.** *Different views of a point cloud under the presence of artificial fog. The scene used is shown in the RGB image without fog.*

At present, there is no solution that fully overcomes this problem. The described effects still seriously degrade the performance of LiDAR systems in adverse weather conditions. However, several lines of research are trying to improve this situation [11–16]. The challenge drives researchers to combine novelties in optical engineering (optical design, component materials, new sensors…) with the fundamentals of the physics of light (light propagation, light-matter interactions…) [1].

In this chapter, we review some basics of the physics of light in order to properly understand the problem. How does light propagate through a turbid medium? How is the medium characterized? What is its effect on pulsed light? Facing this problem with a plausible solution requires knowing and understanding the physical phenomena involved in depth. Modeling allows us to go deeper into what is happening; thus, we also review how models are conventionally approached and the state of the art on the topic.
