**1. Introduction**


Ultra-wideband (UWB) radio sensor networks promise interesting perspectives for the localization of emitters and objects, object identification, and the imaging of environments in short-range scenarios. Their fundamental advantage comes from the huge bandwidth, which can be up to several GHz depending on national regulations. Consequently, UWB technology allows unprecedented spatial resolution in the geo-localization of active UWB radio devices and high resolution in the detection, localization and tracking of passive objects.
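The link between bandwidth and spatial resolution can be made concrete: the achievable range resolution of a radar is roughly ΔR ≈ c/(2B). A minimal sketch (the bandwidth values below are illustrative, not parameters taken from the project):

```python
# Range resolution of a UWB radar: delta_R = c / (2 * B).
# The bandwidth values below are illustrative only.

C = 299_792_458.0  # speed of light in m/s


def range_resolution(bandwidth_hz: float) -> float:
    """Achievable range resolution in metres for a given RF bandwidth."""
    return C / (2.0 * bandwidth_hz)


if __name__ == "__main__":
    for b_ghz in (0.5, 1.0, 3.5, 7.0):
        dr_cm = 100.0 * range_resolution(b_ghz * 1e9)
        print(f"B = {b_ghz:4.1f} GHz -> delta_R = {dr_cm:5.1f} cm")
```

A bandwidth of a few GHz thus translates into centimetre-scale range resolution, which is the basis for the localization accuracy discussed in this chapter.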

With the lower frequencies involved in the UWB spectrum, looking into or through non-metallic materials and objects becomes feasible. This is of major importance for applications like indoor navigation and surveillance, object recognition and imaging, through-wall detection and tracking of persons, ground penetrating reconnaissance, wall structure analysis, etc. UWB sensors preserve their advantages (high accuracy and robust operation) even in complicated, multipath-rich propagation environments. Compared to optical sensors, UWB radar sensors maintain their capability and performance in situations where optical sensors fail. They can even produce useful results in non-LOS (non-Line-of-Sight) situations by taking advantage of multipath.

Despite the excellent range resolution capabilities of UWB radar sensors, detection and localization performance can be significantly improved by cooperation between spatially distributed nodes of a sensor network. This allows robust localization even in the case of partly obscured links. Moreover, distributed sensor nodes can acquire comprehensive knowledge of the structure of an unknown environment and construct an electromagnetic image related to the relative sensor-to-sensor coordinate system. Distributed observation allows the robust detection and localization of passive objects and the identification of certain object features such as shape, material composition, dynamic parameters, and time-variant behavior. All this makes UWB a promising basis for the autonomous navigation of mobile sensor nodes, such as maneuverable robots, in an unknown environment that may arise as a result of an emergency situation.

©2013 Malz et al., licensee InTech. This is an open access chapter distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The objective of the CoLOR project (Cooperative Localization and Object Recognition in Autonomous UWB Sensor Networks) was to develop and demonstrate new principles for localization, navigation and object recognition in distributed sensor networks based on UWB radio technology. The application scenario of the CoLOR project involves mobile and deployable sensor nodes cooperating in an unknown or even hostile indoor environment without any supporting infrastructure, as may occur in emergency situations such as fire disasters, earthquakes or terror attacks. In this case, UWB can be used to identify hazardous situations such as broken walls, locate persons buried alive, roughly check the integrity of building constructions, detect and track victims, etc. In this scenario, it is assumed that optical cameras and other sensors cannot be used. Data fusion of optical image information and UWB radar was not in the scope of this project.

**Figure 1.** CoLOR scenario.

A possible scenario is shown in Fig. 1. The unknown environment is first explored by autonomous robots that deploy fixed nodes at certain positions. Those nodes, capable of both transmitting and receiving, play the role of anchor nodes. They span a local coordinate system and should be placed at "strategic" positions, i.e. positions that span a large volume and ensure complete illumination of the environment. Moving nodes localize themselves relative to this anchor node reference. Moreover, while moving they collect information about the structure of the environment by receiving reflected waves. In this way, they build an "electromagnetic image" of the environment and recognize basic building structures ("landmarks"). Step by step, a map of the environment is built which can be used as another reference for navigation. This procedure is well known from autonomous robot navigation as SLAM (Simultaneous Localization and Mapping); here, however, we do not consider optical methods but UWB to recognize the feature vector. If there are solitary objects, a moving sensor may scrutinize their shape and material composition by circling around them. Other objects, like humans, may walk around and create time-variant reflections that identify their moving trajectory. Humans and animals may also reveal themselves by time-variant features resulting from vital functions such as breathing.

The organization of this chapter is as follows. Section 2.1 describes the architecture of the sensor network and the basic parameters of the UWB sensor nodes that we used to achieve our objectives. Section 2.2 specifies the simulated test scenario that was applied for the development of the localization and imaging algorithms. Simulated data allowed us to develop algorithms in parallel to the demonstrator, which is presented in Section 2.3. The demonstrator was used to assess performance and to evaluate the developed data extraction algorithms in realistic scenarios. The algorithms were developed and evaluated using data obtained from the UWB wave propagation simulator described in Section 3. Within the CoLOR project, algorithms were developed for the cooperative localization of sensor nodes (Section 4), the evaluation of sensor network topology (Section 5), simultaneous localization and map building (Section 6), object parameter estimation (Section 7), imaging of the environment (Section 8), and the detection and localization of moving objects (Section 9). Special attention was given to algorithms that promise real-time capability.

**2. System architecture**

**2.1. Sensor network architecture**

To accomplish the tasks described above, the UWB sensor nodes used can be heterogeneous in terms of their sensing capabilities and mobility. The simplest nodes may act just as illuminators of the environment. This requires only transmitting operation, but no sensing or processing capability. However, multiple transmit signal access has to be organized, either by CDMA (Code Division Multiple Access) or by TDMA (Time Division Multiple Access). TDMA requires some time frame synchronization. CDMA, on the other hand, would need multiple orthogonal codes, which would complicate transmitter circuit design and increase self-interference because of imperfect orthogonality. We therefore preferred TDMA switching.

Given the unprecedented temporal resolution of UWB signals, ToA (Time of Arrival) based localization methods are the natural choice for UWB. ToA, however, requires temporal synchronization between the nodes. This can be achieved by the RTToA (Round Trip Time of Arrival) approach, which means that the sensors involved must be able to retransmit received signals.

Deployable nodes, placed at verified positions, may act as location references or anchor nodes for the localization of roaming nodes. Other nodes are spread around as static or moving observers. Static observers are well suited to observing moving objects, since they can most easily distinguish between desired information and reflections from the static environment (clutter) by exploiting time variance. On the other hand, moving observers (or illuminators, since propagation phenomena are reciprocal) can collect information about static objects and environments (e.g. the structure of walls). By applying coherent data fusion methods, moving nodes create an image of the static propagation environment. This is a full multi-static approach which requires a number of widely distributed cooperating sensor nodes having precise (relative) location information and synchronization at least between subsets of sensors (e.g. between receivers or transmitters or both).

Synchronization issues can be relaxed if we construct a more complex node containing, e.g., one transmitting antenna (Tx) and two receiving (Rx) antennas. Such a sensor already
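As an illustration of the RTToA ranging and anchor-based localization principles described above, the sketch below converts a round-trip time measurement into a distance and solves the resulting trilateration problem by linearized least squares. All names and numeric values are our own illustrative choices, not the CoLOR project's implementation.

```python
# Sketch of RTToA ranging and anchor-based trilateration
# (illustrative only; not the CoLOR project's implementation).
import numpy as np

C = 299_792_458.0  # speed of light in m/s


def rttoa_distance(t_round: float, t_reply: float) -> float:
    """One-way distance from a round-trip time measurement.

    t_round: time from transmission until the retransmitted signal returns;
    t_reply: known turnaround delay inside the responding node.
    """
    return C * (t_round - t_reply) / 2.0


def trilaterate(anchors: np.ndarray, dists: np.ndarray) -> np.ndarray:
    """Least-squares 2-D position from >= 3 anchor positions and ranges.

    Subtracting the first range equation from the others linearizes the
    problem: 2 (a_i - a_0)^T x = |a_i|^2 - |a_0|^2 - d_i^2 + d_0^2.
    """
    a0, d0 = anchors[0], dists[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
         - dists[1:] ** 2 + d0 ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos


anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [10.0, 8.0]])
true_pos = np.array([3.0, 2.0])
dists = np.linalg.norm(anchors - true_pos, axis=1)  # noise-free ranges
print(trilaterate(anchors, dists))  # close to [3. 2.]
```

With noisy range estimates the same least-squares formulation still applies; more anchors simply over-determine the system and average out the errors, which is one motivation for the distributed anchor deployment described above.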
