**5. Space navigation systems**

Common navigation technologies assume motion over a flat, two-dimensional (2D) surface. Navigation in three dimensions (3D) is considerably more complicated and requires new technologies to complement the existing 2D navigation technologies.

#### **5.1 Autonomous navigation of micro aerial vehicles**

In this section we present a low-computational-cost state estimation method enabling autonomous flight of micro aerial vehicles [7]. All estimation and control tasks are solved on board, in real time, on a simple computational unit. The state estimator fuses observations from an inertial measurement unit, an optical flow smart camera, and a time-of-flight range sensor. The smart camera provides optical flow measurements and odometry estimates, avoiding the need for onboard image processing and remaining usable over flight times of several minutes. Based on the estimated vehicle state, a nonlinear controller operating on the special Euclidean group SE(3) drives a quadrotor platform in 3D space while guaranteeing asymptotic stability of the 3D position and heading. The approach is validated through simulations and experimental results.
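The key scaling step in such flow-based odometry can be illustrated with a short sketch. This is a minimal illustration of the general principle (a downward-facing flow camera measures angular motion of the ground, which becomes metric velocity once scaled by the measured height), not the estimator of [7]; the function and parameter names are assumptions:

```python
import numpy as np

def flow_to_velocity(flow_rad_s, range_m, gyro_rad_s):
    """Convert optical-flow angular rates to metric lateral velocity.

    A downward-facing flow camera measures the apparent angular motion of
    the ground (rad/s). Subtracting the vehicle's own rotation rate (from
    the IMU gyro) and scaling by the distance to the ground (from the
    time-of-flight sensor) yields metric velocity: v = (flow - omega) * h.
    """
    flow = np.asarray(flow_rad_s, dtype=float)
    omega = np.asarray(gyro_rad_s, dtype=float)
    return (flow - omega) * range_m

# Pure translation at 2 m altitude: 0.5 rad/s of flow -> 1.0 m/s.
v_translate = flow_to_velocity([0.5, 0.0], 2.0, [0.0, 0.0])

# Hovering while rotating: flow equals gyro rate -> zero velocity.
v_hover = flow_to_velocity([0.10, -0.05], 2.0, [0.10, -0.05])
```

Fusing this velocity with IMU accelerations and the range measurement then gives the full state used by the controller.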


*Expanding Navigation Systems by Integrating It with Advanced Technologies*

**5.2 Vision-based navigation for micro helicopters**

Weiss [8] developed a vision-based navigation system for micro helicopters operating in large, unknown environments. It combines vision-based methods with a sensor fusion approach for state estimation and self-calibration of sensors whose availability varies during flight. This is enabled by an onboard camera, real-time motion sensors, and vision algorithms. An onboard multi-sensor fusion framework estimates the vehicle's pose and the inter-sensor calibration simultaneously, allowing continuous operation. The system runs in time linear in the number of key frames captured in a previously visited area. To maintain constant computational complexity, improve performance, and increase scalability and reliability, the computationally expensive vision part is replaced by the final calculated camera pose.
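The idea of bounding per-frame cost can be sketched with a fixed-size keyframe window. This is an illustrative simplification, not the actual framework of [8]; the class, a one-dimensional "pose" stand-in, and the window size are all assumptions:

```python
from collections import deque

class KeyframeWindow:
    """Fixed-size sliding window of keyframes.

    Keeping only the most recent N keyframes bounds the cost of map
    queries, so per-frame work stays roughly constant instead of
    growing with the length of the flight.
    """

    def __init__(self, max_keyframes=20):
        # deque with maxlen silently discards the oldest entry when full.
        self.frames = deque(maxlen=max_keyframes)

    def add(self, frame_id, pose):
        self.frames.append((frame_id, pose))

    def nearest(self, pose):
        # Linear scan over a bounded window: O(max_keyframes), not O(flight time).
        return min(self.frames, key=lambda f: abs(f[1] - pose))

window = KeyframeWindow(max_keyframes=3)
for i in range(5):
    window.add(i, float(i))  # after 5 inserts, only frames 2, 3, 4 remain
```

A real system would store full SE(3) poses and select keyframes by viewpoint overlap, but the bounded-memory principle is the same.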

**5.3 Space navigation using formation flying tiny satellites**

Traditional space positioning and navigation are based on large satellites flying in semi-fixed orbits and are therefore costly and inflexible [9]. With the recent development of low-mass, low-power navigation sensors and the growing popularity of smaller satellites, a new approach has emerged: many tiny spacecraft fly in clusters under controlled configurations and use their cumulative power to perform the necessary assignments. To keep configurations stable yet changeable, positioning, attitude determination, and intersatellite navigation are used. Carrier-phase differential GPS (CDGPS) determines the relative position and attitude between the formation flying satellites: range coefficients, GPS differential corrections, and other data are exchanged among the spacecraft, enhancing the precision of the ranging and navigation functions. The CDGPS uses the NAVSTAR GPS constellation to provide precise measures of the relative positions and attitudes of the vehicles in the formation.
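The core trick behind carrier-phase differential GPS is differencing: subtracting measurements between two receivers cancels the satellite clock error, and subtracting again between two satellites cancels the receiver clock errors, leaving mostly relative geometry. A minimal numerical sketch (synthetic numbers, not real GPS observables):

```python
def single_difference(phase_a, phase_b):
    """Between-receiver difference for one satellite: cancels the
    satellite clock error common to both receivers."""
    return phase_a - phase_b

def double_difference(phases_a, phases_b, sat_i, sat_j):
    """Between-satellite difference of two single differences: also
    cancels both receiver clock errors."""
    return (single_difference(phases_a[sat_i], phases_b[sat_i])
            - single_difference(phases_a[sat_j], phases_b[sat_j]))

# Synthetic ranges (metres) from receivers a, b to satellites 1, 2,
# plus receiver clock biases (0.7, -0.3) and satellite clock biases (0.2, -0.5).
phases_a = {1: 100.0 + 0.7 + 0.2, 2: 130.0 + 0.7 - 0.5}
phases_b = {1: 102.0 - 0.3 + 0.2, 2: 131.5 - 0.3 - 0.5}

dd = double_difference(phases_a, phases_b, 1, 2)
# Geometry-only value: (100 - 102) - (130 - 131.5) = -0.5; all clock
# biases cancel, so dd recovers it despite the corrupted measurements.
```

In a real system the double-differenced carrier phases still contain integer ambiguities, which must be resolved before the centimetre-level relative baseline is available.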

**6. Pedestrian navigation systems**

Pedestrian navigation services enable people to retrieve precise instructions for reaching a specific location. Because the spatial behavior of people on foot differs in many ways from that of drivers, common concepts for car navigation services are not suitable for pedestrian navigation. Cars use paved roads with clear borderlines and road signs, so keeping the car on track is the main task, neglecting obstacles and hazards unless the system is integrated with a social network. Pedestrians, unlike cars, may not follow a defined road. This makes personal navigation more complicated and forces us to add special features required for safe navigation. Pedestrian navigation requires very accurate, high-resolution, real-time responses [10]. GPS alone does not support last-moment route changes caused by road detours, significant obstacles, or safety requirements. Integrating the IoT and GPS via an application, however, yields a solution that provides accurate and safe navigation. To enable it, a two-stage personal navigation system is used. In the first stage, the trail is photographed by a navigating drone, and the resulting video is saved in a cloud database. In the second stage, a mobile application is loaded onto the pedestrian's mobile phone. When the pedestrian is about to walk, the mobile application is activated, synchronizes itself with the cloud navigation database, and then guides the pedestrian along the trail. A more advanced system contains the two stages within the mobile
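The second-stage guidance loop can be sketched as follows: the app holds a trail (a waypoint list synchronized from the cloud database) and repeatedly directs the pedestrian to the first waypoint not yet reached. The waypoint format, function names, and 5 m reach radius are illustrative assumptions, not details of [10]:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    earth_radius_m = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def next_instruction(position, trail, reach_radius_m=5.0):
    """Return the first trail waypoint not yet reached, or None when done."""
    for waypoint in trail:
        if haversine_m(*position, *waypoint) > reach_radius_m:
            return waypoint
    return None

trail = [(32.0, 34.8), (32.001, 34.8)]   # synced from the cloud database
target = next_instruction((32.0, 34.8), trail)  # standing on waypoint 1
```

In practice the instruction would be rendered as a bearing and distance (or an overlay on the drone video) rather than raw coordinates.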

*DOI: http://dx.doi.org/10.5772/intechopen.91203*




