**7.8. Transfer onto the mobile robot**

Research investigations and algorithms had to be transferred onto a new mobile platform to demonstrate the essential results obtained in CoLOR. This raised many challenges in the areas of robotics, localization, and robot motion. Additionally, an optimal alignment of the sensors and robust UWB radar sensing conditions had to be taken into account for the motion of the robot. The motion algorithms for the robot when moving around the objects are designed to be contour adaptive, to exhibit collision avoidance features, and to use the shortest tracks through a room while maintaining optimal antenna alignments.

**Figure 49.** Example of the inspected object o1. The obtained radargram (left), the wavefront-based imaging of the object with robot track (middle), and the same object with a migrated image (right).

**8. UWB Imaging**

## **8.1. Imaging in distributed multi-static sensor networks**

Although imaging in distributed multi-static sensor networks results in a rough image of the environment, its quality is usually better than that of the image obtained by a single bat-type sensor operating autonomously in an unknown environment. Sensor networks offer more diverse information to the imaging algorithms. The resulting image quality depends greatly on the number and the positions of the illuminating and observing nodes. If enough illuminating and observing nodes are available simultaneously, the instantaneous information can be used even for imaging time-variant scenarios with moving objects. For imaging static environments, one moving observer is sufficient: it collects data, and the image is built sequentially and improved gradually by the imaging algorithm. The time-domain imaging algorithms are referred to as Kirchhoff migration. They rely on the Born approximation, which presumes undisturbed ray-optical propagation [9]. Their basic principle is depicted in Fig. 50 (a).

The illuminator transmits a signal at the fixed point [*xT*, *yT*]. At the variable positions [*xRi*, *yRi*] the receiver collects the impulse responses *Ri*(*τ*). Assuming a single-bounce reflection and the propagation velocity *v*, the echo reflected from an object situated at the position [*xo*, *yo*] can be found at the time delay *τi* = (*rT* + *rRi*)/*v* in the measured impulse response, where *rT* and *rRi* denote the distances from the object to the transmitter and to the *i*-th receiver position. A single observation only constrains the object to an ellipse with the transmitter and receiver as foci. In order to focus the image, observations from different positions must be fused. Conventional migrations fuse the observations by a simple summation

$$o(x_o, y_o) = \frac{1}{N} \sum_{i=1}^{N} R_i(\tau_i) \tag{17}$$

**Figure 50.** Principle of imaging algorithms.
where *N* is the number of observations available for the data fusion. A disadvantage of this fusion is indicated in Fig. 50 (a): the signal energy does not cumulate only at the desired pixel. Additional local maxima arise at every position where ellipses cross, and the elliptical traces themselves are added to the focused image. Both effects decrease the image quality. In [73] we have proposed a data fusion method which reduces these artifacts and enhances the image quality. The algorithm is based on the cross-correlation of multiple observations. The spatial scheme for the cross-correlation algorithm is indicated in Fig. 50 (b). The observation from the reference receiver Rx*ref* is cross-correlated with each observation of the moving receiver. The cross-correlation yields non-zero values only at positions where the ellipse related to the reference receiver crosses the ellipse related to the moving receiver. The non-zero values resulting from the cross-correlation are summed by the data fusion algorithm according to

$$o(x_o, y_o) = \frac{1}{N} \sum_{i=1}^{N} \int_{-T/2}^{T/2} R_i(\tau_i + \zeta) \, R_{ref}(\tau_{ref} + \zeta) \, d\zeta \tag{18}$$

where *τref* = (*rT* + *rRref*)/*v* is the time delay related to the echo measured by the reference receiver, and *T* is the pulse duration of the autocorrelation function of the stimulation signal. In order to improve the image quality, the number of reference receivers can be increased. The performance of this imaging algorithm strongly depends on the spatial arrangement of the cross-correlated receivers. If the arrangement of the receivers is chosen disadvantageously, the performance of this algorithm tends toward the conventional migration described by (17). If the spatial arrangement is selected in an optimal way, the cross-correlated imaging outperforms the conventional algorithm. In order to illustrate the differences between the conventional and the cross-correlated algorithm, data simulated by the ray-tracing algorithm [60] were used for the data fusion. The simulated scenario consisted of a sensor node which moved through a rectangular room of size 8 m by 9 m. The inspected room contained seven objects, namely, one large distributed object (some meters in extent) and six smaller objects (decimeters). The sensor node had one transmitting and two receiving antennas and operated in the frequency band from 4.5 GHz to 13.4 GHz. The antennas were directional; therefore, the moving sensor scanned the room by a full turn at selected positions. Almost 10,000 impulse responses were simulated for this scenario. Figure 51 (a) shows the result obtained by the conventional migration algorithm. The focused image contains intersecting ellipses. However, as indicated before, the intersections are not exclusively at the target positions. The ellipses also intersect at positions where no target is situated, and even the traces of these ellipses make the interpretation of the image difficult. Figure 51 (b) illustrates the image of the same scenario obtained by the cross-correlated algorithm. The color scaling of both images in Fig. 51 (a) and (b) is the same. The reduction of image artifacts by means of the cross-correlated algorithm is evident: elliptical traces were significantly reduced, which helps to identify the inspected environment more clearly. A more detailed discussion of cross-correlated imaging with measurement examples is given in [74] and [75].
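The two fusion rules can be contrasted in a small numerical experiment. The following sketch (plain NumPy; the function names, the Gaussian pulse model, and the geometry are illustrative assumptions, not the CoLOR implementation) images a single point scatterer with the summation rule (17) and the cross-correlated rule (18):

```python
import numpy as np

def simulate_responses(target, tx, rx_positions, v=1.0, fs=200.0, t_max=8.0, width=0.05):
    """One simulated impulse response per receiver position: a Gaussian echo
    centred at the single-bounce delay tau_i = (r_T + r_Ri) / v."""
    t = np.arange(0.0, t_max, 1.0 / fs)
    r_t = np.linalg.norm(target - tx)
    R = np.array([np.exp(-0.5 * ((t - (r_t + np.linalg.norm(target - rx)) / v) / width) ** 2)
                  for rx in rx_positions])
    return t, R

def conventional_migration(xs, ys, tx, rx_positions, t, R, v=1.0):
    """Eq. (17): o(x_o, y_o) = (1/N) * sum_i R_i(tau_i)."""
    img = np.zeros((len(ys), len(xs)))
    for iy, y in enumerate(ys):
        for ix, x in enumerate(xs):
            p = np.array([x, y])
            r_t = np.linalg.norm(p - tx)
            img[iy, ix] = np.mean([np.interp((r_t + np.linalg.norm(p - rx)) / v, t, r)
                                   for rx, r in zip(rx_positions, R)])
    return img

def cross_correlated_migration(xs, ys, tx, rx_positions, t, R, ref=0, v=1.0, half_T=0.1):
    """Eq. (18): every observation is cross-correlated over [-T/2, T/2] with a
    reference receiver before the summation, which suppresses ellipse traces."""
    dt = t[1] - t[0]
    zeta = np.arange(-half_T, half_T + dt, dt)        # integration variable
    img = np.zeros((len(ys), len(xs)))
    for iy, y in enumerate(ys):
        for ix, x in enumerate(xs):
            p = np.array([x, y])
            r_t = np.linalg.norm(p - tx)
            tau_ref = (r_t + np.linalg.norm(p - rx_positions[ref])) / v
            s_ref = np.interp(tau_ref + zeta, t, R[ref])
            acc = sum(np.sum(np.interp((r_t + np.linalg.norm(p - rx)) / v + zeta, t, r) * s_ref) * dt
                      for rx, r in zip(rx_positions, R))
            img[iy, ix] = acc / len(rx_positions)
    return img

# One point scatterer, a fixed illuminator, and five receiver positions
tx = np.array([0.0, 0.0])
rxs = [np.array(p) for p in [(2.0, 0.0), (2.0, 0.5), (2.0, 1.0), (0.0, 2.0), (0.5, 2.0)]]
target = np.array([1.0, 1.5])
t, R = simulate_responses(target, tx, rxs)

xs = ys = np.linspace(0.0, 2.0, 21)
img_sum = conventional_migration(xs, ys, tx, rxs, t, R)
img_cc = cross_correlated_migration(xs, ys, tx, rxs, t, R)
iy_s, ix_s = np.unravel_index(np.argmax(img_sum), img_sum.shape)
iy_c, ix_c = np.unravel_index(np.argmax(img_cc), img_cc.shape)
```

In this toy setup both images peak at the scatterer, but the cross-correlated image carries energy only where an ellipse crosses the reference ellipse, which is the artifact-reduction effect discussed above.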

**Figure 51.** Imaging in sensor networks.

**Figure 52.** Measurement configuration. *T*1...*Tn*...*TN* are different transmission positions on the track **L***l*. *R*1...*RM* are sparsely spaced receivers with different reception angles. The dashed curve is an ellipse segment with the foci *Rm* and *Tn*.

designed based on Gaussian clutter and noise. Mathematically, the "*time-shift* & *accumulation*" operation is described as

$$\sum_{n=1}^{N} s_{R_m, T_n}\left(t + t^{diff}_{T_n, T_1}\right) \tag{19}$$

where *sRm*,*Tn*(*t*) is the signal received by the receiver *Rm* with respect to the transmission position *Tn*, and *t*<sup>diff</sup><sub>*Tn*,*T*1</sub> = |**T***n* − **T**1| /*c* is the time delay to be compensated for. **T***n* and **T**1 are the position vectors of *Tn* and the reference transmission position *T*1, respectively. In the operation,

• the responses of the objects (*Oi*) in the direction of interest (for example the direction of **L***l* in Fig. 52) are time-shifted, aligned and then accumulated. It is a coherent operation in which the parameter *t*<sup>diff</sup><sub>*Tn*,*T*1</sub> is used for time-delay correction. Hence, responses are enhanced by *N* times compared to the case of single channel data.

• The responses of objects (*Cj*) out of the direction of interest are non-uniformly time-shifted, disturbed and then accumulated. It is an incoherent operation. As a consequence, for a certain time-slot, the responses from objects located on different ellipses (or ellipsoids) are non-uniformly time-shifted and accumulated. Hence, these responses are attenuated compared to the output of the coherent operation above [70].

**Figure 53.** Flow chart of the algorithm.
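The accumulation rule (19) can be reproduced in a short numerical sketch. The geometry below is an illustrative assumption, not the measured configuration: the transmitter steps along the track line away from a target lying on that same line, so that |**T***n* − **T**1|/*c* exactly compensates the delay change and the target echoes add coherently, while an off-track clutter object is shifted non-uniformly:

```python
import numpy as np

def time_shift_accumulate(t, signals, tx_positions, c=1.0):
    """Eq. (19): accumulate s_{Rm,Tn}(t + t_diff) with
    t_diff = |T_n - T_1| / c, the time delay to be compensated."""
    t1 = tx_positions[0]                        # reference transmission position T_1
    acc = np.zeros_like(t)
    for s, tn in zip(signals, tx_positions):
        t_diff = np.linalg.norm(tn - t1) / c
        acc += np.interp(t + t_diff, t, s, left=0.0, right=0.0)
    return acc

def echo(t, delay, width=0.05):
    """Illustrative Gaussian echo centred at the given propagation delay."""
    return np.exp(-0.5 * ((t - delay) / width) ** 2)

c = 1.0                                         # normalized propagation velocity
t = np.arange(0.0, 12.0, 0.005)
rx = np.array([0.0, 1.0])                       # fixed receiver R_m
tx_positions = [np.array([-0.25 * n, 0.0]) for n in range(8)]   # track L_l
target = np.array([5.0, 0.0])                   # object O_i on the line of the track
clutter = np.array([2.0, 3.0])                  # object C_j off the track

signals = []
for tn in tx_positions:
    tau_o = (np.linalg.norm(target - tn) + np.linalg.norm(target - rx)) / c
    tau_c = (np.linalg.norm(clutter - tn) + np.linalg.norm(clutter - rx)) / c
    signals.append(echo(t, tau_o) + echo(t, tau_c))

acc = time_shift_accumulate(t, signals, tx_positions)
tau_target = (np.linalg.norm(target - tx_positions[0]) + np.linalg.norm(target - rx)) / c
```

With *N* = 8 transmissions the target peak grows roughly *N*-fold, while the clutter echoes smear along the time axis, matching the coherent/incoherent distinction above.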
