### **3. The autonomous navigation for Mars exploration**

#### **3.1 The autonomous and the nonautonomous navigations in Mars exploration**

As we have stated, the Mars exploration mission can be divided into four phases: the launch phase, the cruise flight phase, the EDL phase, and the roving phase. **Table 6** compares the autonomous and the nonautonomous navigation techniques across these four phases. In fact, both autonomous and nonautonomous navigation methods are needed in deep space exploration. Regarding the nonautonomous methods in **Table 6**, GPS, the BeiDou satellite system, GLONASS, and the Galileo system are the best-known satellite navigation systems in the world. The radio-based method carries out navigation by means of the Deep Space Network (DSN)-based system [47] or the prober's onboard radar equipment. The DSN is a ground-based tracking, telemetry, and control network for satellite navigation, control, and observation in deep space. Giant radio transmitters, X-ray telescopes, and communication terminals are deployed at different sites on Earth or in orbit to assist deep space navigation.

The navigation in the EDL phase faces many challenges. First, the complex atmosphere and weather on Mars create uncertainty for the navigation task: the thin atmosphere reduces the available aerodynamic force [48]; no accurate environment model or aerodynamic model is available; gusts and sandstorms degrade the landing precision; the descent speed is high; and the communication blackout zone [49] still exists, which interrupts normal communication. Second, the long spaceflight causes changes and damage to the mechanical structure and the sensors [50] of the prober; cosmic rays, extreme temperatures, and space debris all affect the prober's health state. Third, the one-way time delay from Earth to Mars, which can exceed 10 minutes, means that ground controllers cannot monitor the flight state in real time. To overcome these problems to some extent, autonomous navigation can be used during the EDL phase: in the entry stage, both inertial navigation and celestial navigation can be used; in the descent stage, inertial navigation can be considered; and in the landing stage, inertial navigation and visual navigation can both be employed.
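The quoted delay can be sanity-checked with a quick light-time calculation. The sketch below uses approximate bounds on the Earth-Mars distance (the exact values vary with the orbital geometry) purely for illustration.

```python
# One-way signal delay between Earth and Mars (illustrative figures).
AU_KM = 149_597_870.7   # astronomical unit in km
C_KM_S = 299_792.458    # speed of light in km/s

def one_way_delay_minutes(distance_au: float) -> float:
    """Light travel time for a given Earth-Mars distance in AU."""
    return distance_au * AU_KM / C_KM_S / 60.0

# The Earth-Mars distance varies roughly between ~0.37 AU (closest
# approach) and ~2.67 AU (opposite sides of the Sun).
closest = one_way_delay_minutes(0.37)    # about 3 minutes
farthest = one_way_delay_minutes(2.67)   # about 22 minutes
print(f"one-way delay: {closest:.1f} to {farthest:.1f} minutes")
```

So for much of the mission the one-way delay is indeed above 10 minutes, and a command-response round trip takes twice as long.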

The visual navigation in the roving phase also has some problems. First, since many optical devices are utilized, the complex ambient light and the dust in the air will seriously affect their working states [51]. Because the Mars surface has no liquid water and no vegetation, wind-blown sand is common, and the lens of the imaging sensor may become contaminated. Second, the low surface temperature on Mars is also a problem: most storage batteries cannot work normally at such low temperatures, and the strength of materials will also change. Third, the limited performance of the imaging sensor constrains what the image processing algorithms can achieve. To control the launch mass, a large, high-performance camera cannot be sent to Mars; thus, how to use the images from the low-resolution imaging sensor for data analysis purposes is a challenge. To overcome these problems, on the one hand, special materials and processes will be used to improve the performance of the imaging sensor; on the other hand, suitable workflows and data fusion algorithms [52] should be designed according to the environmental characteristics of Mars.


*Autonomous Navigation for Mars Exploration DOI: http://dx.doi.org/10.5772/intechopen.92093*

| No. | The flight phase | The autonomous navigation methods | The nonautonomous navigation methods |
|---|---|---|---|
| 1 | The launch phase | The inertial navigation and the celestial navigation | The satellite navigation and the radio navigation |
| 2 | The cruise flight phase | The inertial navigation and the celestial navigation | The radio navigation, and so on |
| 3 | The EDL phase | The inertial navigation, the celestial navigation, and the visual navigation | The radio navigation, and so on |
| 4 | The roving phase | The inertial navigation, the celestial navigation, and the visual navigation | The radio navigation, and so on |

**Table 6.**
*The autonomous and the nonautonomous navigation methods in different flight phases of Mars exploration.*




*Mars Exploration - A Step Forward*

…sensation, the olfactory sensation, and the tactile sensation can all be utilized in this method, and lots of new sensors can be developed for this navigation technique. Third, the Mars atmosphere model-based navigation [46] can also be considered; for example, a kind of Mars flush air data system has been used to assist the estimation of the dynamic pressure, the Mach number, and the angle of attack of the Mars prober. By using all these new navigation theories and methods, the final navigation accuracy can definitely be improved.


#### **3.2 The autonomous navigation algorithms**

Many navigation algorithms have been developed for the autonomous navigation application. Regarding the inertial navigation, large amounts of time series data are collected from the accelerometer and the gyroscope. Because of electrical radiation, shock and vibration, and temperature drift, the measurement data are always contaminated by system noise or external noise; as a result, denoising and trend prediction are necessary. In general, the classic Kalman filter, the Extended Kalman Filter (EKF), the Unscented Kalman Filter (UKF) [53], and the particle filter [54] can be used to improve the data quality. The Kalman filter is a sequential filtering technique that needs no complex iterative computation: its processing speed is fast, and its data storage requirement is small. In other cases, the least squares filter [55] can also be considered for the denoising problem. The least squares filter is a batch processing algorithm that needs no a priori information about the state variable; it has been used to solve the orbit estimation problem.
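The sequential character of the Kalman filter can be seen in a minimal scalar sketch. The example below, with invented noise levels and a constant-bias signal standing in for an accelerometer channel, is an illustration of the idea rather than any flight implementation.

```python
import random

def kalman_1d(measurements, q=1e-5, r=0.1 ** 2):
    """Scalar Kalman filter for a (nearly) constant signal.
    q: process noise variance, r: measurement noise variance (assumed)."""
    x, p = measurements[0], 1.0        # initial state estimate and covariance
    estimates = []
    for z in measurements:             # one cheap update per sample: no batch
        p = p + q                      # predict: constant state, covariance grows
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)            # correct with the innovation (z - x)
        p = (1.0 - k) * p              # updated covariance
        estimates.append(x)
    return estimates

random.seed(0)
true_bias = 0.5                        # hypothetical constant sensor bias
zs = [true_bias + random.gauss(0.0, 0.1) for _ in range(500)]
est = kalman_1d(zs)
print(round(est[-1], 3))               # filtered estimate near the true bias
```

Note that each step uses only the previous estimate and the new sample, which is exactly why the storage requirement stays small.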

The celestial navigation uses the imaging sensor to observe typical celestial bodies, so many 2D data processing methods can be employed. For example, dim target detection can be used to identify Phobos and Deimos, the moons of Mars; image denoising [56] can be considered for noise suppression; spectral analysis can be employed to assist the analysis of the atmosphere's composition; and image quality evaluation can also be used for celestial body recognition. The visual navigation likewise uses the imaging sensor to perceive the environment, so the related image processing algorithms [57], such as camera calibration, image enhancement, image restoration, image denoising, feature detection, feature matching, image mosaicking, and image segmentation, can all be utilized. Recently, the Simultaneous Localization and Mapping (SLAM) technique [58] has begun to be used for map-based navigation in confined areas; its common variants include Lidar-based SLAM and visual SLAM. This technique can be used for visual navigation in the Mars roving phase in the future.
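The feature matching step mentioned above can be illustrated with a toy template match: locating a landmark patch from one frame inside the next frame by normalized cross-correlation. Real pipelines use detectors such as Harris corners or ORB; the synthetic "terrain" arrays and the brute-force search below are assumptions made for the sketch.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_patch(template, image):
    """Return the (row, col) in `image` whose patch best matches `template`."""
    th, tw = template.shape
    best, best_rc = -2.0, (0, 0)
    for r in range(image.shape[0] - th + 1):       # brute-force search
        for c in range(image.shape[1] - tw + 1):
            score = ncc(template, image[r:r + th, c:c + tw])
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc, best

rng = np.random.default_rng(42)
frame1 = rng.random((40, 40))                      # synthetic terrain image
frame2 = np.roll(frame1, (3, 5), axis=(0, 1))      # camera shifted by (3, 5) px
template = frame1[10:18, 10:18]                    # a landmark patch in frame 1
(r, c), score = match_patch(template, frame2)
print((r, c), round(score, 3))                     # recovered shift: (13, 15)
```

Matched landmark displacements like this one are the raw input from which visual navigation and visual SLAM estimate the rover's own motion.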

The integrated navigation collects multivariate data, so data fusion techniques should be applied in its data processing. Data fusion improves the precision and reliability of data processing through the reasoning over and integration of measurement information. In general, data fusion algorithms include the linear weighting-based method, the filtering with Bayesian estimation-based method [59], the factor graph-based method [60], and the interacting multiple model fusion-based method. Among these, the linear weighting-based method uses a weighted sum of the input data to carry out the fusion computation. The filtering with Bayesian estimation-based method uses prior information and Bayesian statistical theory to implement the fusion calculation. The factor graph-based method employs a Bayesian network or a Markov model [61] to realize the data fusion. In addition, the interacting multiple model fusion-based technique uses two or more filters or prediction models to accomplish the data fusion. The data fusion computation can be realized by either a parallel or a sequential algorithm.
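The linear weighting-based method can be sketched as an inverse-variance weighted average of two independent estimates of the same quantity. The sensor names and variances below are invented for the example; the weighting rule itself is the standard minimum-variance choice.

```python
def fuse(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two independent estimates.
    The weights minimize the variance of the fused estimate."""
    w1 = var2 / (var1 + var2)          # more weight to the less noisy input
    w2 = var1 / (var1 + var2)
    fused = w1 * x1 + w2 * x2
    fused_var = (var1 * var2) / (var1 + var2)  # always <= min(var1, var2)
    return fused, fused_var

# Hypothetical altitude estimates from an IMU and a visual landmark fix.
imu_alt, imu_var = 1020.0, 25.0        # metres, metres^2
vis_alt, vis_var = 1008.0, 5.0
alt, var = fuse(imu_alt, imu_var, vis_alt, vis_var)
print(round(alt, 1), round(var, 2))    # -> 1010.0 4.17
```

The fused variance is smaller than either input variance, which is the sense in which fusion improves precision and reliability.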
