In Fig. 42, it can clearly be seen that only the corners are focused in the resulting image, but not the flat structures. The reason for this effect is that a flat plate does not depolarize, but a 45°-dihedral does.

Thus, the detection capability of a UWB radar system can be improved by exploiting polarization diversity. Under certain circumstances, the radar images reveal object features which would have remained invisible in mono-polarized radar systems. Supplementary information about the contour, orientation and dimensions of the object can thus be obtained, which upgrades super-resolution UWB radar significantly.

### *7.6.2. Polarimetric investigation of bulk goods with rough surfaces*

One possible discrimination criterion between smooth and rough surfaces is to exploit specific depolarization effects. Rough in this context means that the standard deviation of the outer surface height distribution is in the range of a few wavelengths of the operating carrier frequency, i.e. 9 GHz in this case. To obtain results which are independent of both material and shape, extensive measurements were performed.

Four metallic objects with smooth surfaces were used for the investigations: two objects with a square cross-section of different size, one object with an isosceles triangular cross-section, and one object with a rectangular cross-section. Details are shown in Fig. 44. For comparison with rough surfaces, four polystyrene bins were built and filled with bulk goods: chunky gas concrete, chunky sand-lime brick, medium-density fiberboard (MDF) blocks with 0.01 m edge length, and M16 × 25 mm screws. Due to its material composition, the polystyrene bin itself has a vanishing radar cross-section, so that reflections caused by the bin are negligible.

For the depolarization investigations, these 8 objects under test were scanned in both co-polarized as well as both cross-polarized configurations on a circular track with a radius of 1 m at a 1° grid. The depolarization is expressed by the ratio of the signal powers *P*VH/*P*VV.

**Figure 43.** Results of the depolarization investigation of the object with square cross-section and 0.3 m edge length, and of the bin with the same inner dimension. The depolarization coefficient *P*VH/*P*VV of the smooth surface is depicted on the left; the same coefficient is shown for the bulk goods in the middle and on the right.

In Fig. 43, the results for the object with a square cross-section are compared with those of the bulk-good objects. As expected, the objects with a smooth surface depolarize least, about -20 dB in the mean of all measurements, which complies with the 20 dB cross-polarization suppression of the antenna characteristic. The depolarization is lowest when an edge with predominantly vertical orientation is illuminated. Gas concrete and MDF are the bulk goods which depolarize least; the higher the permittivity, the more the bulky material depolarizes, i.e. sand-lime brick by about 8 dB more, and the screws, which depolarize most, by over 10 dB more than the objects with a flat surface.
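As a minimal numerical illustration of this evaluation, the sketch below computes the angle-resolved depolarization coefficient in dB and its mean from co- and cross-polarized power measurements. The array names and the synthetic power values are assumptions for illustration only; the chapter does not publish its processing code.

```python
import numpy as np

def depolarization_db(p_vv: np.ndarray, p_vh: np.ndarray) -> np.ndarray:
    """Depolarization coefficient P_VH / P_VV in dB, per scan position."""
    return 10.0 * np.log10(p_vh / p_vv)

# Hypothetical received powers on the 1-degree circular track (360 positions).
rng = np.random.default_rng(0)
p_vv = rng.uniform(0.5, 1.0, 360)                 # co-polarized power
p_vh = 0.01 * p_vv * rng.uniform(0.5, 2.0, 360)   # cross-polarized power, roughly -20 dB

coeff = depolarization_db(p_vv, p_vh)
print(f"mean depolarization: {coeff.mean():.1f} dB")  # smooth target: around -20 dB
```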

So UWB radar seems to be capable of discriminating objects of different materials by the roughness of their surface, subject to the condition that the height deviation is not much smaller than the wavelength. These results highlight the superior capabilities of fully polarimetric systems and recommend their use in future radar systems.

**7.7. Object recognition for full and restricted circumnavigation**

The object recognition (OR) method proposed in this work is part of the previously introduced super-resolution radar imaging system. The investigated objects, and the reference alphabet derived from them, consist of the simple canonical and some polygonal complex objects shown in Fig. 44.

As mentioned above, the OR algorithms in [50] and [48] yield very robust results with the method of moment invariants and Fourier descriptors, which was proven for images obtained on complete circular tracks around the objects. However, in many cases such complete tracks are not possible, and the resulting object images remain incomplete. To perform OR also in such situations, the method of the Curvature Scale Space (CSS) was applied due to its ability to robustly recognize contour parts [35].

### *7.7.1. Object recognition algorithm with the curvature scale space*

The CSS representation is invariant against rotation of objects: a rotation merely causes a circular shift of the CSS, which has no effect on the recognition process, since the CSS of an object under test is compared by correlation with the CSS of every reference object. Moreover, the CSS is highly robust against noise, as most of its influence is suppressed by the smoothing Gaussian filter. Another property of the CSS is that it retains the local properties of shapes: each peak of the CSS corresponds to a concavity or a convexity, and a local deformation of the shape changes only the corresponding local part of the CSS image. Thus, a restricted part of a curve can be located exactly within the CSS of the whole curve. Moreover, the absolute value of a CSS peak indicates the curvature radius, and its algebraic sign indicates whether the curve is concave or convex.
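For reference, the curvature underlying the CSS can be stated explicitly. The formula below is the standard one from the CSS literature [35] and is added here for clarity; the notation (contour parameter *u*, scale *σ*) is conventional rather than taken from the chapter:

$$
\kappa(u,\sigma) = \frac{X_u(u,\sigma)\,Y_{uu}(u,\sigma) - X_{uu}(u,\sigma)\,Y_u(u,\sigma)}{\bigl(X_u(u,\sigma)^2 + Y_u(u,\sigma)^2\bigr)^{3/2}}
$$

Here $X(u,\sigma) = x(u) * g(u,\sigma)$ and $Y(u,\sigma) = y(u) * g(u,\sigma)$ are the contour coordinates smoothed by a Gaussian kernel $g$ of width $\sigma$, and subscripts denote derivatives with respect to $u$. Tracking the behaviour of $\kappa$ over increasing $\sigma$ yields the scale-space representation whose peaks are discussed above.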

In Fig. 45, a plane curve with 8 convex or concave parts is drawn in blue, and its Gaussian-smoothed version is drawn in red. On the right side of Fig. 45, the corresponding curvature of the smoothed red curve is plotted.


**Figure 44.** Cross-sections of the 12 reference objects. The first six objects, o1-o6, were used for the experimental OR investigations.

**Figure 45.** A plane curve in blue with its Gaussian smoothed version on the left, and the curvature of the smoothed curve on the right.
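To make the construction concrete, the following sketch computes such a signed curvature function for a closed contour smoothed by a Gaussian filter, in the spirit of the CSS described above. It is a minimal sketch under assumed conventions (uniformly sampled closed contour, wrap-around filtering) and is not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def curvature(x: np.ndarray, y: np.ndarray, sigma: float) -> np.ndarray:
    """Signed curvature of a closed contour after Gaussian smoothing at scale sigma.

    Positive and negative values distinguish convex from concave parts; the
    magnitude is the inverse of the local curvature radius.
    """
    # Smooth the coordinate functions with wrap-around (closed curve).
    xs = gaussian_filter1d(x, sigma, mode="wrap")
    ys = gaussian_filter1d(y, sigma, mode="wrap")
    # First and second derivatives along the contour parameter.
    xu, yu = np.gradient(xs), np.gradient(ys)
    xuu, yuu = np.gradient(xu), np.gradient(yu)
    return (xu * yuu - xuu * yu) / (xu**2 + yu**2) ** 1.5

# Example: a square-like contour (cf. object o1) sampled at 360 points.
t = np.linspace(0, 2 * np.pi, 360, endpoint=False)
r = 1.0 / np.maximum(np.abs(np.cos(t)), np.abs(np.sin(t)))  # unit square in polar form
kappa = curvature(r * np.cos(t), r * np.sin(t), sigma=4.0)
print("samples in strongly convex (corner) regions:", int((kappa > 0.5 * kappa.max()).sum()))
```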

### *7.7.2. Object recognition performance test*

The previously discussed OR algorithm based on the comparison of CSS was applied to objects o1-o6 in an extensive measurement series. Every object under test was measured for 400 different orientations and locations, with offsets of up to 0.2 m, for a full circular track and for restricted tracks of 270° and 180°, respectively.

In the evaluations, every CSS of an object under test was compared with all 12 reference CSS data by a correlation function. The global maximum of the 12 correlation functions then indicated the recognized object.
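A compact sketch of this decision rule is given below: the rotation invariance noted in Section 7.7.1 is obtained by using a circular cross-correlation, and the recognized object is the reference whose correlation attains the global maximum. Function and variable names are illustrative; the chapter does not publish its evaluation code.

```python
import numpy as np

def circular_correlation(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Circular cross-correlation via FFT; a rotation of the object only
    shifts the CSS, so the correlation peak is unaffected."""
    A, B = np.fft.rfft(a), np.fft.rfft(b)
    return np.fft.irfft(A * np.conj(B), n=len(a))

def recognize(test_css: np.ndarray, reference_css: list[np.ndarray]) -> int:
    """Index of the reference object with the globally maximal correlation
    to the CSS of the object under test."""
    scores = [circular_correlation(test_css, ref).max() for ref in reference_css]
    return int(np.argmax(scores))

# Usage with hypothetical 1-D CSS signatures of the 12 reference objects:
rng = np.random.default_rng(1)
references = [rng.standard_normal(360) for _ in range(12)]
probe = np.roll(references[4], 57)          # reference o5, rotated by 57 samples
print("recognized object index:", recognize(probe, references))  # -> 4
```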

**Figure 46.** Discrete probability density function of the recognition process of object o1 for the 270° track (left) and the 180° track (right).

Figure 46 shows that object o1 can clearly be recognized even with the restricted track of 270°. Only for the 180° restriction is object o3 recognized instead of o1, which is not unexpected since o3 equals a part of o1.

**Figure 47.** Discrete probability density function of the recognition process of object o5 for the full track (left), the 270° track (middle) and the 180° track (right).

Figure 47 shows that object o5 can be recognized in about 80% of all cases for a full track. This value drops to about 60% when the track is restricted to 270°. For the 180° track, o5 cannot be recognized anymore and is confused with o3.

**Figure 48.** Discrete probability density function of the recognition process of object o6 for the full track (left), the 270° track (middle) and the 180° track (right).

Object o6 has approximately the same recognition characteristic as o5, as shown in Fig. 48. As both edge lengths of o6 are 0.3 m, o1 is falsely recognized, since a certain part of its cross-section has the same form as o5 or o6.

**7.8. Transfer onto the mobile robot**

To demonstrate the essential results obtained in CoLOR, the investigated algorithms had to be transferred onto a new mobile platform, which raised many challenges in the areas of robotics, localization and robot motion. Additionally, an optimal alignment of the sensors and robust UWB radar sensing conditions had to be taken into account for the motion of the robot. The motion algorithms for the robot moving around the objects are designed to be contour-adaptive, to exhibit collision avoidance, and to use the shortest tracks through a room while maintaining optimal antenna alignment.
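As an illustration of the contour-adaptive idea, the sketch below derives a standoff track and antenna headings from a known object contour. This is a hypothetical reduction of the described behaviour (the chapter does not give the motion algorithm itself); collision avoidance and shortest-path planning are omitted.

```python
import numpy as np

def contour_adaptive_track(contour: np.ndarray, standoff: float):
    """Waypoints at a constant standoff from a closed, counterclockwise object
    contour, with headings that keep the antenna pointed at the object.

    contour: (N, 2) array of points along the object outline.
    Returns (waypoints (N, 2), headings (N,) in radians).
    """
    # Unit tangents along the contour, then outward normals (CCW convention).
    tangents = np.gradient(contour, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    normals = np.column_stack((tangents[:, 1], -tangents[:, 0]))
    waypoints = contour + standoff * normals
    # Antenna alignment: point from each waypoint back to its contour point.
    to_object = contour - waypoints
    headings = np.arctan2(to_object[:, 1], to_object[:, 0])
    return waypoints, headings
```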

**Figure 49.** Example of the inspected object o1: the obtained radargram (left), the wavefront-based imaging of the object with the robot track (middle) and the same object as a migrated image (right).

**Figure 50.** Principle of the imaging algorithms: (a) Kirchhoff migration; (b) cross-correlated imaging with the reference receiver Rx*ref*.

where *N* is the number of observations available for the data fusion. A disadvantage of this fusion is indicated in Fig. 50 (a): the signal energy does not cumulate only at the desired pixel. Other local maxima arise at every position where ellipses cross, and the elliptical traces themselves are added to the focused image as well. Both effects decrease the image quality. In [73] we have proposed a data fusion method which reduces these artifacts and enhances the image quality. The algorithm is based on the cross-correlation of multiple observations. The spatial scheme of the cross-correlation algorithm is indicated in Fig. 50 (b). The observation from the reference receiver Rx*ref* is cross-correlated with each observation of the moving receiver. The cross-correlation yields non-zero values only at positions where the ellipse related to the reference receiver crosses the ellipse related to the moving receiver. These non-zero values are summed by the data fusion algorithm according to

$$
o(x_o, y_o) = \frac{1}{N} \sum_{i=1}^{N} \int_{-T/2}^{T/2} R_i(\tau_i + \zeta)\, R_{ref}(\tau_{ref} + \zeta)\, \mathrm{d}\zeta \qquad (18)
$$

where *τref* = (*rT* + *rRref*)/*v* is the time delay related to the echo measured by the reference receiver, and *T* is the pulse duration of the autocorrelation function of the stimulation signal. In order to improve the image quality, the number of reference receivers can be increased. The performance of this imaging algorithm strongly depends on the spatial arrangement of the cross-correlated receivers. If the arrangement of the receivers is chosen disadvantageously, the performance of this algorithm tends toward the conventional migration described by (17); if the spatial arrangement is selected in an optimal way, the cross-correlated imaging outperforms the conventional algorithm. In order to illustrate the differences between the conventional and the cross-correlated algorithm, data simulated by the ray-tracing algorithm [60] were used for the data fusion. The simulated scenario consisted of a sensor node which moved through a rectangular room of size 8 m by 9 m. The inspected room contained seven objects: one large distributed object (some meters) and six smaller objects (decimeters). The sensor node had one transmitting and two receiving antennas and operated in the frequency band from 4.5 GHz to 13.4 GHz. Since the antennas were directional, the moving sensor scanned the room by a full turn at selected positions. Almost ten thousand impulse responses were simulated for this scenario. Figure 51 (a) shows the result obtained by the conventional migration algorithm. The focused image contains intersecting ellipses. However, as indicated before, the intersections are not exclusively at the target positions; the ellipses also intersect at positions where no target is present.
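To make the contrast between the two fusion rules concrete, the following sketch implements both a conventional delay-and-sum migration and the cross-correlated fusion of (18) on a pixel grid. It is a simplified sketch under assumed conventions (sampled impulse responses on a common time axis, known antenna positions, propagation speed *v*); the function names are illustrative and not taken from the chapter or [73].

```python
import numpy as np

V = 0.2998  # propagation speed v in m/ns (free space)

def delays(px, py, tx, rx):
    """Round-trip delay tau = (r_T + r_R) / v for every pixel of the grid."""
    r_t = np.hypot(px - tx[0], py - tx[1])
    r_r = np.hypot(px - rx[0], py - rx[1])
    return (r_t + r_r) / V

def migrate(px, py, tx, rx_pos, signals, t):
    """Conventional migration: sum every observation at its pixel delay.
    Energy also smears along the ellipses, causing the artifacts of Fig. 50 (a)."""
    img = np.zeros_like(px, dtype=float)
    for rx, s in zip(rx_pos, signals):
        img += np.interp(delays(px, py, tx, rx), t, s)
    return img / len(signals)

def cross_corr_migrate(px, py, tx, rx_pos, signals, t, rx_ref, s_ref, T, dt):
    """Cross-correlated fusion following (18): each moving-receiver observation
    is correlated with the reference observation over a lag window of length T,
    so energy remains only where the two ellipses intersect."""
    img = np.zeros_like(px, dtype=float)
    tau_ref = delays(px, py, tx, rx_ref)
    for rx, s in zip(rx_pos, signals):
        tau_i = delays(px, py, tx, rx)
        acc = np.zeros_like(px, dtype=float)
        for zeta in np.arange(-T / 2, T / 2, dt):  # numerical integral over zeta
            acc += np.interp(tau_i + zeta, t, s) * np.interp(tau_ref + zeta, t, s_ref)
        img += acc * dt
    return img / len(signals)
```

With a pixel grid from `np.meshgrid` and simulated impulse responses, the second function suppresses the elliptical traces when the reference receiver is placed well apart from the moving receiver, which mirrors the arrangement dependence noted in the text.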
