autonomous navigation of mobile sensor nodes, such as maneuverable robots, in an unknown environment that may arise as a result of an emergency situation.

**2. System architecture**

The objective of the CoLOR project (Cooperative Localization and Object Recognition in Autonomous UWB Sensor Networks) was to develop and demonstrate new principles for localization, navigation and object recognition in distributed sensor networks based on UWB radio technology. The application scenario of the CoLOR project involves mobile and deployable sensor nodes cooperating in an unknown or even hostile indoor environment without any supporting infrastructure, as may occur in emergency situations such as fire disasters, earthquakes or terror attacks. In such cases, UWB can be used to identify hazardous situations such as broken walls, to locate persons buried alive, to roughly check the integrity of building constructions, to detect and track victims, etc. In this scenario, it is assumed that optical cameras and other sensors cannot be used; data fusion of optical image information and UWB radar was not in the scope of this project.

A possible scenario is shown in Fig. 1. The unknown environment is first explored by autonomous robots that deploy fixed nodes at certain positions. Those nodes, being able to transmit and receive, play the role of anchor nodes. They span a local coordinate system and should be placed at "strategic" positions, i.e. positions that span a large volume and ensure a complete illumination of the environment. Moving nodes localize themselves relative to this anchor node reference. Moreover, while moving, they collect information about the structure of the environment by receiving reflected waves. In this way, they build an "electromagnetic image" of the environment and recognize basic building structures ("landmarks"). Step by step, a map of the environment is built, which can be used as another reference for navigation. This procedure is well known from autonomous robot navigation as SLAM (Simultaneous Localization and Mapping); here, however, we do not use optical methods but UWB to recognize the features. If there are solitary objects, a moving sensor may scrutinize their shape and material composition by circling around them. Other objects, such as humans, may walk around and create time-variant reflections that identify their presence.

**Figure 1.** CoLOR scenario.

## **2.1. Sensor network architecture**

To accomplish the tasks described above, the UWB sensor nodes used can be heterogeneous in terms of their sensing capabilities and mobility. The simplest nodes may act just as illuminators of the environment. This requires only transmit operation, but no sensing or processing capability. However, the multiple access of the transmit signals has to be organized, either by CDMA (Code Division Multiple Access) or by TDMA (Time Division Multiple Access). TDMA requires some time-frame synchronization. CDMA, on the other hand, would need multiple orthogonal codes, which would complicate the transmitter circuit design and increase self-interference because of non-perfect orthogonality. We therefore preferred TDMA switching.
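As a minimal illustration of the chosen TDMA scheme, the following Python sketch assigns periodic transmit slots to illuminator nodes in a round-robin frame. The node identifiers, slot duration and frame count are illustrative assumptions, not CoLOR parameters.

```python
# Round-robin TDMA slot assignment sketch: each illuminator node transmits
# exactly once per frame, in a fixed order (illustrative values only).

def tdma_schedule(node_ids, slot_us, frame_count):
    """Return (start_time_us, node) pairs for frame_count TDMA frames."""
    schedule = []
    for frame in range(frame_count):
        for slot, node in enumerate(node_ids):
            start = (frame * len(node_ids) + slot) * slot_us
            schedule.append((start, node))
    return schedule

# Example: three illuminators, 100 us slots, two frames.
for t, node in tdma_schedule(["N1", "N2", "N3"], slot_us=100, frame_count=2):
    print(f"t = {t:4d} us: {node} transmits")
```

A real deployment would additionally need the time-frame synchronization mentioned above, so that all nodes agree on the frame boundaries.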

With its unprecedented temporal resolution, UWB makes ToA (Time of Arrival) based methods the natural choice for localization. ToA, however, requires temporal synchronization between the nodes. This can be achieved by the RTToA (Round Trip Time of Arrival) approach, which requires that the sensors involved are able to retransmit received signals.
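The benefit of the round-trip approach can be sketched in a few lines: the responder's absolute clock offset cancels out of the round trip, and only its known turnaround delay must be subtracted before converting to distance. The numbers below are illustrative, not CoLOR measurements.

```python
# RTToA ranging sketch: distance follows from the round-trip time minus the
# responder's (known) turnaround delay; no common clock is needed.

C = 299_792_458.0  # speed of light in m/s

def rttoa_range_m(t_round_ns, t_turnaround_ns):
    """One-way distance in meters from a two-way time measurement."""
    return C * (t_round_ns - t_turnaround_ns) * 1e-9 / 2.0

# A 10 m separation gives ~66.7 ns of two-way propagation; with a 200 ns
# turnaround, the measured round trip is ~266.7 ns.
print(rttoa_range_m(266.7, 200.0))  # close to 10 m
```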

Deployable nodes, placed at verified positions, may act as location references or anchor nodes for the localization of roaming nodes. Other nodes are spread around as static or moving observers. Static observers are well suited to observing moving objects, since they can most easily distinguish between the desired information and reflections from the static environment (clutter) by exploiting time variance. Moving observers (or illuminators, since propagation phenomena are reciprocal), on the other hand, can collect information about static objects and environments (e.g. the structure of walls). By applying coherent data fusion methods, moving nodes can create an image of the static propagation environment. This is a fully multi-static approach, which requires a number of widely distributed cooperating sensor nodes with precise (relative) location information and synchronization at least between subsets of sensors (e.g. between receivers, transmitters, or both).
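To make the anchor-based localization of roaming nodes concrete, the following Python sketch recovers a 2-D position from ranges to three anchors by linearizing the range equations. The anchor layout and range values are made up for illustration; this is a textbook trilateration sketch, not the estimator developed in CoLOR.

```python
# 2-D trilateration sketch: subtracting the first range equation from the
# others removes the quadratic terms, leaving a linear 2x2 system.

def trilaterate(anchors, ranges):
    """Position (x, y) from three anchor positions and measured ranges."""
    (x0, y0), (x1, y1), (x2, y2) = anchors
    r0, r1, r2 = ranges
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = r0**2 - r1**2 + x1**2 + y1**2 - x0**2 - y0**2
    b2 = r0**2 - r2**2 + x2**2 + y2**2 - x0**2 - y0**2
    det = a11 * a22 - a12 * a21  # Cramer's rule for the 2x2 system
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical node at (3, 4) with anchors spread around the room:
print(trilaterate([(0, 0), (9, 0), (0, 8)], [5.0, 52 ** 0.5, 5.0]))
```

With noisy ranges and more than three anchors, the same linear system would be solved in the least-squares sense instead.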

Synchronization issues can be relaxed if we construct a more complex node containing, e.g., one transmitting antenna (Tx) and two receiving (Rx) antennas. Such a sensor already constitutes a small-baseline bi-static radar, which somewhat resembles the sensing capability of a bat and allows the estimation of both object range and direction (by using the time difference of arrival, TDoA). We will also refer to these nodes as "bat-type sensors". Their advantage is that mutual coherent processing is restricted to one platform. So, if several of those nodes cooperate, they can directly exchange range and DoA (direction of arrival) information, which we call "non-coherent cooperation", since no exact phase synchronization between the nodes is required. There is, of course, also a mixed approach, which may consist of non-synchronized illuminators and locally coherent (differential) observers. So, with regard to the capabilities of the sensor nodes involved and to their mutual cooperation, we distinguish between three basic structures of sensor networks:

**Figure 2.** Basic architecture of a UWB sensor node.

The basic architecture of a bat-type sensor is shown in Fig. 2; see [47], [29], and Chapter 14. The transmit signal is a periodic M-sequence spread-spectrum signal, generated by a high-speed 12-stage digital shift register. It is clocked by a stable RF clock generator, a dielectric resonance oscillator operating at 7 GHz, which allows a usable frequency band of up to 3.5 GHz (baseband). The sequence contains 4095 chips, corresponding to a duration of 585 ns and about 175 m of usable distance. In the passband mode, the signal is up- and down-converted by the same frequency, which results in a usable frequency band from 3.5 GHz to 10.5 GHz, corresponding to the well-known FCC specification. The receiver applies time subsampling by a factor of 512, which relaxes the ADC (analog-to-digital converter) bandwidth requirements and still allows real-time data recording at a measurement rate of 50 impulse responses per second.

Data measured in real time by static anchor nodes allow the extraction of information about time-varying objects and people within the inspected scenario. Real-time data measured by moving nodes of the network allow the extraction of information about the geometric structure of the environment. In order to develop the data extraction algorithms in parallel with the development of the demonstrator, we simulated a test scenario in terms of electromagnetic wave propagation. This simulated scenario is described in the following Section 2.2.

## **2.2. Simulated scenario**

As not everything was ready for the demonstration scenario (e.g. the robot was not yet available), a scenario was simulated using the ray tracing tool<sup>3</sup> in order to develop and test the algorithms for map building, localization and imaging independently. For the final demonstration, the scenario described in Section 2.3 was used.

The simulated scenario is shown in Fig. 3. It consists of a room of size 9 × 8 × 4 m with a pillar at its center. Six different objects (shown on the right-hand side of Fig. 3) are distributed in the room; their positions are marked in the figure. Metal is chosen as the material for the walls and for the six objects.

The bat-type sensor follows the track indicated by the dotted line. The sensor is equipped with one transmit and two receive antennas. The transmit antenna is located at the point (0, 0, 1) of the robot's local Cartesian coordinate system (with x being the moving direction, y the transverse direction, and z the height), and the receive antennas at (0, 0.5, 1) and (0, -0.5, 1). At 78 positions, indicated by the red circles on the track, the robot stops and takes measurements (the channel is simulated accordingly). At each of these positions, the transceiver array rotates in the azimuth plane (x-y plane) with an angular step of 3°, so that 120 simulations are performed per position. A horn antenna with a 3 dB beamwidth of around 60° at 9 GHz was used as the radiation pattern for the transmit and receive antennas. The channel was simulated from 4.5 to 13.4 GHz in steps of 6.25 MHz in order to describe the frequency-selective behavior realistically.

The inclusion of high-frequency aspects of the transmission channel is particularly important for the accuracy of the simulation results; otherwise, unrealistic conclusions will be drawn from the simulation. In order to achieve a realistic evaluation of the efficiency of sensing and imaging systems, the channel model must describe the multipath propagation realistically. A description of the channel simulator used, as well as some verification measurements, is given in Section 3.

## **2.3. Demonstrator**

To prove robustness and applicability, the research results obtained within CoLOR were experimentally validated in an extensive measurement campaign. The main focus was placed on a robust transfer of the algorithms from laboratory conditions to a more realistic indoor propagation scenario. The goals of these extensive multi-disciplinary measurements were

• to verify UWB sensing algorithms in real indoor scenarios,