#### **2.2.1 Distributed target detection method in the Gaussian scale-space**

In high-resolution SAR images, each target occupies several resolution cells and forms an area target. Detecting a ship in a high-resolution SAR image should therefore treat it as a distributed target; the point-target assumption of traditional radar is no longer suitable. The project here proposes a distributed target detection method in the Gaussian scale-space, in which the distance relationship among the detected objects is adopted to identify the distributed target. When the background's scattering distribution is hard to estimate, or the SNR (signal-to-noise ratio) is low, this method detects distributed targets more effectively than the CFAR method.

#### **2.2.2 Ship size category estimation model**

In high-resolution SAR images, ship targets can be divided into three categories according to their dimensions: small ships (≤50 m) are represented as point targets, middle-sized ships (50–100 m) as single-corner distributed targets, and big ships (≥100 m) as double-corner distributed targets. According to the relative positions of the corners, combined with the orbit information (resolution and incident angle) of the SAR image, the length, height and direction of the ship can be worked out.

#### **2.2.3 Ship location correction and ship direction estimation method**

The imaging geometry of SAR is slant-range projection, so geometric distortions such as layover, foreshortening and shadow cause a measurement error between the observed position of a ship and its actual location. In this method, the convergent point of the wake pattern is calculated from the directional texture of the ship's wake; this point is the actual location of the stern, from which the position-correcting parameters can be worked out.

A bow wave is the wave that forms at the bow of a ship as it moves through the water. As the bow wave spreads out, it defines the outer limits of the ship's wake. Theoretically, the convergent point of the bow wave's outline must lie on the extension line of the ship's heading. This convergent point can be calculated by the Hough transform, and the heading direction of the ship can then be calculated from the coordinates of the bow and the stern.

#### **2.2.4 Wave direction estimation based on the partial energy direction**

Wave direction estimation can be used to analyze the sea state of the target area and to support the choice of the search and rescue area. The method is built on the steerable filter, a filter set composed of an even-symmetric filter and an odd-symmetric filter. When this quadrature filter set is rotated to the orientation of the local texture, the oriented energy reaches its maximum. The orientation corresponding to the maximum oriented energy is defined as the dominant orientation of the local energy at that point, from which the main direction of the wave can be estimated.

#### **2.2.5 The registration algorithm of SAR image and nautical chart based on Gaussian principle curve**

To ensure precise detection and location of the distressed ship in the MSR, the navigational chart and the remote sensing image should be matched beforehand. Because the SAR image and the electronic chart come from different sensors, their content and intensities differ considerably. The coastline is a stable and reliable feature for navigation in coastal areas; however, the deformation between edges extracted from different signals may produce position errors, and the noise in radar signals may greatly influence the edge extraction result. How to obtain reliable control points and how to obtain the correct correspondence are therefore the key issues in the registration algorithm. In this chapter, a multi-scale registration algorithm for the SAR image and the electronic chart is presented. Based on the scale-space theory, coastlines from the two images are matched in both the frequency domain and the image domain over a continuous range of scale levels.

#### **3. Algorithms and methods in the maritime search and rescue**

#### **3.1 Distributed target detection method in the Gaussian scale-space**

In high-resolution SAR images, large-scale and super-large-scale ships (>100 meters) are presented as distributed targets. In this algorithm, these targets are detected through the distance relationship between the strong echoes of the mast and those of the ship hull. The detection of a distributed target based on location-dependent information is completed in two steps. In the first step, the ship target is characterized in the Gaussian scale-space, which transforms the signal range values into binary ones; the detection of the singular objects is then implemented using a constant false alarm rate (CFAR), and the locations of the pixels whose value is 1 are recorded. The second step finds the distributed target from the result of the first step by applying the location-dependent information.

#### **3.1.1 Ship characteristic description**

Under ideal conditions, in the image *z*(*x*) whose background gray value is 0, there is a strong scattered point *fb*(*x*) with maximum value *h* and width *w*. Considering the edge effect of the image, we build the mathematical model as follows:

$$f\_b(\mathbf{x}) = \begin{cases} h(1 - \left(\mathbf{x}/w\right)^2), & |\mathbf{x}| \le w \\ 0, & |\mathbf{x}| > w \end{cases} \tag{1}$$

The responses of these spot-like targets in the Gaussian scale space are presented as $r_b(x, \sigma, w, h)$:

$$\begin{aligned} r\_b(\mathbf{x}, \sigma, w, h) &= g\_{\sigma}(\mathbf{x}) \ast f\_b(\mathbf{x}) \\ &= \frac{h}{w^2} [(w^2 - \mathbf{x}^2 - \sigma^2)(\phi\_\sigma(\mathbf{x} + w) - \phi\_\sigma(\mathbf{x} - w)) \\ &- 2\sigma^2 \mathbf{x} (g\_{\sigma}(\mathbf{x} + w) - g\_{\sigma}(\mathbf{x} - w)) - \sigma^4 (g\_{\sigma}'(\mathbf{x} + w) - g\_{\sigma}'(\mathbf{x} - w))] \end{aligned} \tag{2}$$

$$\text{Among them, }\,\phi\_{\sigma}(\mathbf{x}) = \int\_{-\infty}^{\mathbf{x}} e^{-\frac{t^{2}}{2\sigma^{2}}} dt\text{ .}$$
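As a numerical check of this characterization (my own sketch, not the chapter's code), the parabolic pulse of Eq. (1) can be convolved with a normalized Gaussian kernel: the response stays centred on the scatterer while its peak decays as the scale grows, which is the behaviour Eq. (2) describes.

```python
import numpy as np

def f_b(x, h=1.0, w=5.0):
    """Parabolic model of a strong scattered point, Eq. (1)."""
    return np.where(np.abs(x) <= w, h * (1.0 - (x / w) ** 2), 0.0)

def gaussian_response(x, sigma, h=1.0, w=5.0):
    """r_b(x, sigma) = (g_sigma * f_b)(x), by direct numerical convolution."""
    dx = x[1] - x[0]
    g = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return np.convolve(f_b(x, h, w), g, mode="same") * dx

x = np.linspace(-30, 30, 1201)
# The peak stays at the scatterer centre and decays with increasing scale.
peaks = [gaussian_response(x, s).max() for s in (0.5, 1.0, 2.0, 4.0)]
print(peaks)
```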


In order to separate the ship target from the grey scale space, we define a Gaussian comparison function *eb*(*x*). If the response of *f*(*x*) in the Gaussian scale-space is $r(x, \sigma, w, h)$, then we have:

$$e\_b(\mathbf{x}) = \begin{cases} 1, & r(\mathbf{x}, \sigma, w, h) \le f(\mathbf{x}), \text{ and } \|\mathbf{x}\| \le w \\ 0, & \text{other} \end{cases} \tag{3}$$

Then, the detection of the singular target is implemented by way of a constant false alarm rate (CFAR).

#### **3.1.2 Distributed target detection based on location-dependent information**

Set the SAR image as an $N_R \times N_A$ matrix, where $N_R$ and $N_A$ represent the dimensions in range and azimuth respectively, and denote by $i_{R1}, i_{R2}, \cdots, i_{RK}$ the range coordinates and by $i_{A1}, i_{A2}, \cdots, i_{AK}$ the azimuth coordinates of the detected pixels for convenience.

The relative distance among the detected scattered units can be defined as:

$$d(j,k) = \left\|i\_j - i\_k\right\|,\ k > j,\ j = 1, 2, \cdots, K - 1 \tag{4}$$

where $\|\cdot\|$ means the norm. Equation (4) represents the relative distance between two scattered units *j* and *k*, and it expresses the location relations among the target pixel points after the Gaussian scale-space detection.

When the range and the azimuth are considered separately, the distance can be defined as

$$d(j,k) = \left( \left| i\_{Rk} - i\_{Rj} \right|,\ \left| i\_{Ak} - i\_{Aj} \right| \right) \tag{5}$$

Due to the geometric distortions of the SAR image, the ship hull and the mast will occupy several resolution units in the image. Set a candidate ship target $M_0$ ($M_0$ includes the ship hull and the mast), where $M_{0R}$ and $M_{0A}$ represent the size of $M_0$ in the range direction and the azimuth direction respectively. Set a bounding box of size $M_{0R} \times M_{0A}$, which is the smallest rectangle containing the detected ship. Assume that the size of the detected target is $S_R \times S_A$ and the radar resolution is $\rho_R \times \rho_A$; then $M_{0R} \times M_{0A} = \frac{S_R}{\rho_R} \times \frac{S_A}{\rho_A}$.

Regarding the distributed targets, at least *M* scattering points will be detected after the second detection step, and *M* can then be defined as the distance threshold $d_{th}$ within the distributed target.

$$d\_{th} = M = \mu\, M\_{0R} \times M\_{0A} \tag{6}$$

Here $\mu \in (0,1)$ is the confidence coefficient, and it is determined by the empirical data of the radar echo.

Remote Sensing Application in the Maritime Search and Rescue 387

| Image Position | Breadth (km) | Distance to the Substellar Point (km) | Incident Angle (º) |
|----------------|--------------|---------------------------------------|--------------------|
| IS1            | 104.8        | 187.2–292.0                           | 15.0–22.9          |
| IS2            | 104.8        | 242.0–346.9                           | 19.2–26.7          |
| IS3            | 81.5         | 337.2–418.7                           | 26.0–31.4          |
| IS4            | 88.1         | 412.0–500.1                           | 31.0–36.3          |
| IS5            | 64.2         | 490.4–554.6                           | 35.8–39.4          |
| IS6            | 70.1         | 549.7–619.8                           | 39.1–42.8          |
| IS7            | 56.5         | 614.7–671.1                           | 42.5–45.2          |

Table 2. Corresponding incident angle range of ENVISAT ASAR

Define the size of the distributed target *Ts* as:

$$T\_s = d(M\_0, (0,0))\tag{7}$$

Equation (7) defines the distance from the position of the bounding box M0 to the origin point (0,0). The distance describes the size of the detecting target, which is related to the form of the distance definition.

According to the size of the bounding box $M_0$ and Equation (6), the number of scattered points (marked as *u*) in each target reference window and their locations (i.e. the target location) can be judged as follows.

$$\begin{cases} u \ge d\_{th}, & \text{target} \\ u < d\_{th}, & \text{non-target} \end{cases} \tag{8}$$

Under the definition in Equation (7), $d(T_{position}(k), i_k)$ and $T_s$ are both two-dimensional data, so their comparison needs to be defined. Set $T_s = (T_{sR}, T_{sA})$ and $d(T_{position}(k), i_k) = (d_R, d_A)$; then the output of $d(T_{position}(k), i_k) > T_s$ is defined as:

$$d(T\_{position}(k), i\_k) > T\_s = \begin{cases} 1, & \text{otherwise} \\ 0, & d\_R \le T\_{sR} \text{ and } d\_A \le T\_{sA} \end{cases} \tag{9}$$

#### **3.1.3 Algorithm realization**

According to the above description of the algorithm, the integrated detection procedure based on two-dimensional location-dependent information is as follows:

1. Initially set $k = 1$, $u = 1$, and the position of the target centre as $T_{position}(k) = i_k$;
2. If $d(T_{position}(k), i_k) \le T_s$, the number of pixels contained in the current target becomes $u = u + 1$, and the central position of the current target is renewed as $T_{position}(k) = \left(u\,T_{position}(k) + i_k\right)/(u + 1)$;
3. If $d(T_{position}(k), i_k) > T_s$, create a new target with its central pixel location $T_{position}(k) = i_k$ and $u = 1$;
4. Traverse and search the whole image.

#### **3.2 Ship size category estimation model**

In the SAR images with high resolution, the ship target can be divided into two categories according to their dimension. The small ships are represented as single corner target while big ships as double corner distributed targets. According to the relative plane location of the double corner and combining the orbit information (resolution ratio and incident angle) of SAR images, the parameters of the ship's status, such as the length, height and the heading direction can be calculated. Table 2 shows the corresponding incident angle of ENVISAT ASAR.


Build a space coordinate system as in Fig. 1 with the true north (Y axis) as the reference direction. The ship length is *l*, the width is *w*, the height is *h*, and the azimuthal angle of the bow is *a* (the angle between the bow and the north); then a ship model can be described by a set of geometric parameters as $P = [l, w, h, a]$. For example, if the satellite is descending, for ENVISAT ASAR the image is a right-view image. Set the radar incident angle as *θ*, the azimuthal angle as *β* (the angle between the satellite and the true north) and the height as *H*, as shown in Fig. 1. According to the SAR imaging mechanism, the shadow in the figure shows the ship's scattering image, which is determined by the ship's ground position, shape and the radar's scattering orientation. Assume that the ship hull scattering length is *s* and the width is *d*. In the range direction:

$$\begin{aligned} w &= b \cos(\beta - a) \\\\ b &= d \tan^2 \theta \\\\ w &= d \tan^2 \theta \cos(\beta - a) \end{aligned} \tag{10}$$

Here *w* is the ship width, *b* is the width of the ship at the radar range, *d* is the scattering width of the hull target, *θ* is the radar incident angle, *β* is the radar azimuthal angle and *a* is the bow azimuthal angle. In the azimuth direction:

$$\begin{aligned} s &= l \cos(\beta - a) \\\\ l &= \frac{s}{\cos(\beta - a)} \\\\ w &= d \tan^2 \theta \cos(\beta - a) \end{aligned} \tag{11}$$

Here *l* is the ship length, *s* is the scattering length of the ship target image at the radar range, *β* is the radar azimuthal angle and *a* is the bow azimuthal angle.

According to their length, ships can be divided into small-sized (<50 meters), mid-sized (50–100 meters), big-sized (100–200 meters) and extra-big-sized (>200 meters) ships.
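Equations (10) and (11) and the length classes above can be combined into a small sketch; the function names and the sample numbers are my own, and all angles are assumed to be in radians.

```python
import math

def ship_dimensions(s, d, theta, beta, a):
    """Recover ship length and width from image measurements.

    s, d:  scattering length and width measured in the image
    theta: radar incident angle; beta: radar azimuthal angle;
    a:     bow azimuthal angle (all in radians).
    """
    l = s / math.cos(beta - a)                         # Eq. (11)
    w = d * math.tan(theta) ** 2 * math.cos(beta - a)  # Eq. (10)
    return l, w

def size_category(l):
    """Length classes given at the end of Section 3.2."""
    if l < 50:
        return "small"
    if l < 100:
        return "mid"
    if l < 200:
        return "big"
    return "extra-big"

l, w = ship_dimensions(s=110.0, d=20.0,
                       theta=math.radians(23.0),
                       beta=math.radians(190.0),
                       a=math.radians(170.0))
print(round(l, 1), size_category(l))  # 117.1 big
```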



Fig. 1. The ship geometric projection model in the SAR image

#### **3.3 Ship positioning correction and direction estimation**

#### **3.3.1 Ship positioning correction**

Due to the geometric distortions of SAR imaging, there exists a measurement error between the observed position and the actual location. With this method we can get the real position of the stern by detecting the ship's wake region with directional texture and calculating the convergent point of the ship's wake; thereby, the position-correcting parameters can be worked out. This method enables us to get the wave-making information of the distressed ship when the ship profile is invisible or the sea state is violent, which improves the flexibility and automaticity of understanding marine remote sensing images.

#### **3.3.1.1 Wake region determination**

A wake is the directional texture formed on the water surface immediately behind the ship; therefore, the mean direction of the wake is consistent with the ship heading. In this algorithm, taking the ship as the centre, we divide the sea surface in the ship's neighbourhood into several partially overlapping sectors. Define the angle between an edge of a sector and the positive direction of the X axis as the edge direction angle; the two edge direction angles of a sector define its direction range, and the median of the two edge angles is the main direction of the sector. Calculate the textural energy along the main direction in each sector, and take the sector with the largest textural energy as the wake area. The direction range of that sector is regarded as the wake direction range, and the target area close to the wake range is the stern in the image.
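A toy version of this sector search might look as follows. The per-pixel orientation and energy maps are assumed to be given (e.g. from the steerable filter of Section 2.2.4), and all names and parameters here are my own.

```python
import numpy as np

def wake_sector(xy, orient, energy, n_sectors=12, half_width=np.pi / 6):
    """Return the main direction of the sector with the largest wake energy.

    xy:     (N, 2) pixel offsets from the ship centre
    orient: (N,) local texture orientation per pixel (radians)
    energy: (N,) local texture energy per pixel
    """
    ang = np.arctan2(xy[:, 1], xy[:, 0])              # direction of each pixel
    best, best_score = None, -1.0
    for k in range(n_sectors):                        # overlapping sectors
        main = -np.pi + (2 * np.pi * k) / n_sectors   # sector main direction
        diff = np.angle(np.exp(1j * (ang - main)))    # wrapped angular distance
        in_sector = np.abs(diff) <= half_width
        # Score: energy of pixels whose orientation agrees with the main direction.
        agree = np.cos(orient[in_sector] - main) ** 2
        score = np.sum(energy[in_sector] * agree)
        if score > best_score:
            best, best_score = main, score
    return best

rng = np.random.default_rng(0)
# Synthetic wake: strong, aligned texture trailing along the +x direction.
wake = rng.uniform([5, -2], [40, 2], size=(200, 2))
noise = rng.uniform(-40, 40, size=(200, 2))
xy = np.vstack([wake, noise])
orient = np.concatenate([np.zeros(200), rng.uniform(-np.pi, np.pi, 200)])
energy = np.concatenate([np.full(200, 5.0), np.ones(200)])
print(wake_sector(xy, orient, energy))  # ~0.0, i.e. the +x sector
```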

#### **3.3.1.2 Course calculation**

388 Remote Sensing – Applications


In the wake area, calculate the textural direction pixel by pixel, and take the pixels whose textural directions fall within the wake direction interval as the wake points. Use the least-squares method to calculate the mean direction of the wake; the result is defined as the direction of the ship's wake, i.e. the course.

#### **3.3.1.3 Actual location of the stern**

Theoretically, a wake is the region of disturbed flow immediately behind the ship, and the texture of the wake pattern should converge to a point which is the real position of the stern. Collect the wake points in the wake region and use the Hough transform to calculate the vanishing point of the wake texture. This point is regarded as the actual location of the stern.
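The chapter computes this vanishing point with a Hough transform; as a compact stand-in, the sketch below finds the least-squares intersection of the wake texture lines (each wake point with its unit texture direction), i.e. the point minimising the summed squared distance to all lines. Names and the synthetic data are my own.

```python
import numpy as np

def convergent_point(points, dirs):
    """Least-squares intersection of lines (p_i, unit direction n_i).

    Minimises sum_i ||(I - n_i n_i^T)(x - p_i)||^2 via a 2x2 linear system.
    Requires at least two non-parallel directions, otherwise A is singular.
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for p, n in zip(points, dirs):
        proj = np.eye(2) - np.outer(n, n)   # projector orthogonal to the line
        A += proj
        b += proj @ p
    return np.linalg.solve(A, b)

# Wake texture lines all passing through the stern at (0, 0):
stern = np.array([0.0, 0.0])
angles = np.deg2rad([5, -10, 20, -25, 40])
dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
points = stern + 15.0 * dirs                # one sample point on each line
print(convergent_point(points, dirs))       # converges back to ~(0, 0)
```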

#### **3.3.1.4 Projection offset calculation**

According to the actual position of the stern and its image position, the projection offset can be calculated. Using the stern offset, the projection model is built. After calibration, the original detection result can be corrected and the actual location of the ship target can be obtained.

#### **3.3.2 Heading direction estimation method**

A bow wave is the wave that forms at the bow of a ship as it moves through the water. As the bow wave spreads out, it defines the outer limits of the ship's wake. Theoretically, the convergent point of the bow wave's outline must be on the extension line of the ship's heading. The convergent point can be obtained by the Hough transform, and the heading direction of the ship can be calculated from the positions of the bow and the stern. Using the orbit information (resolution and incident angle) of the SAR image, the length, height and direction of the ship can be worked out. The heading direction vector is defined in accordance with the bow wave's outline, pointing from the actual location of the stern to the convergent point of the bow wave. The heading is then defined as the angle between the heading direction vector and the true north, measured anti-clockwise.
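The final angle computation can be sketched as a small helper (my own naming and axis convention: x east, y north, heading measured anti-clockwise from true north as in the text):

```python
import math

def heading_from_north(stern, convergent):
    """Heading of the vector stern -> convergent point, in degrees.

    Measured anti-clockwise from true north (+y), in [0, 360).
    """
    dx = convergent[0] - stern[0]   # x: east
    dy = convergent[1] - stern[1]   # y: north (the reference direction)
    return math.degrees(math.atan2(-dx, dy)) % 360.0

print(heading_from_north((0, 0), (0, 10)))   # 0.0   (due north)
print(heading_from_north((0, 0), (-10, 0)))  # 90.0  (due west, anti-clockwise)
```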

#### **3.4 The registration algorithm of SAR image and nautical chart based on Gaussian principle curve**

To register the remote sensing image with the navigational radar image and the chart, the detection results are shown directly on both the remote sensing image and the chart, and contrast verification is then made among the remote sensing detection results and the data of radar and AIS. A multi-scale matching algorithm of radar image and chart is proposed in this project, which transforms the coastline into a set of smooth curves in the Gaussian scale space and performs coarse-to-fine registration between the radar image and the chart coastline, in the frequency field and the spatial field separately.

#### **3.4.1 Curve feature representation**

In the extracting coastlines, many near-shore objects such as ships and navigation marks that also have strong echoes may be merged by mistakes, leading to *Ω*-shaped spurs of the

Remote Sensing Application in the Maritime Search and Rescue 391


In order to reduce the influence of this kind of noise, a geometric criterion is proposed to avoid selecting initial seeds on spurs. To find a proper seed, each candidate seed on the rough coastline is examined by judging the angle it subtends with its adjacent selected seed, as seen from a certain point on the land: the mirror of the radar image center, i.e., of the own-ship position. The procedure is illustrated in Fig. 2. Search for the successor of seed $v_i$ in the counter-clockwise direction along the initial coastline, where $O$ is the own-ship position and $O'$ is its mirror point perpendicular to the course. Judge the angle $\theta_i$ between $v_i$ and the candidate point $v_{i+1}$ as seen from $O'$. A spur may be present when $\theta_i$ is small or even negative; bypass such points $v_{i+1}$ until a point is met whose angle $\theta_i$ with $v_i$ is larger than a predefined threshold.

Fig. 2. Removing spur noise from the radar coastline
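The spur-rejection criterion can be sketched as a greedy scan along the initial coastline. The names and the threshold value below are illustrative, not taken from the original implementation.

```python
import math

def select_seeds(coastline, mirror_point, min_angle_deg=2.0):
    """Greedy seed selection with the geometric spur criterion (sketch).

    A candidate v_{i+1} is accepted only if the signed angle it subtends
    with the current seed v_i, seen from the mirror point O', exceeds a
    threshold; small or negative angles indicate an Omega-shaped spur."""
    def bearing(p):
        return math.atan2(p[1] - mirror_point[1], p[0] - mirror_point[0])

    seeds = [coastline[0]]
    for cand in coastline[1:]:
        # Signed counter-clockwise angle from the last seed to the candidate.
        theta = bearing(cand) - bearing(seeds[-1])
        theta = (theta + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
        if math.degrees(theta) > min_angle_deg:
            seeds.append(cand)   # the coastline advances CCW around O'
        # else: bypass the point, it probably lies on a spur
    return seeds
```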

Then, a family of smoothed coastlines is derived in the Gaussian scale-space, as shown in Fig.3. Scale-space is a special type of multi-scale representation that comprises a continuous scale parameter and preserves the same spatial sampling at all scales.

$$L(\cdot\,;\sigma) = g(\cdot\,;\sigma) * f \tag{12}$$

where the Gaussian kernel is $g(x;\sigma) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{x^{2}}{2\sigma^{2}}\right)$. Because the coastline in the electronic chart is rather smooth, the scale-space derivation is only done for the SAR image.

Fig. 3. Continuous smooth coastline in different scale spaces
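Equation (12) applied to a digitized coastline amounts to convolving each coordinate sequence with a sampled Gaussian. A minimal sketch follows; the wrap-around padding (treating the curve as closed) is an assumption of this sketch.

```python
import numpy as np

def gaussian_kernel(sigma):
    """Sampled 1-D Gaussian kernel g(x; sigma), normalized to sum to 1."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2.0 * sigma**2))
    return g / g.sum()

def smooth_curve(points, sigma):
    """Eq. (12), L = g * f, applied to each coordinate of a coastline
    given as an (N, 2) array; wrap-around padding keeps a closed curve
    closed."""
    g = gaussian_kernel(sigma)
    pad = len(g) // 2
    out = np.empty_like(points, dtype=float)
    for d in range(points.shape[1]):
        col = np.concatenate([points[-pad:, d], points[:, d], points[:pad, d]])
        out[:, d] = np.convolve(col, g, mode="valid")
    return out

# A family of smoothed coastlines at decreasing scales, as in Fig. 3:
# curves = [smooth_curve(coastline, s) for s in (16, 8, 4, 2, 1)]
```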

#### **3.4.2 Registration**

390 Remote Sensing – Applications

coastlines. In order to reduce the influence caused by this kind of noises, a geometric criterion is proposed to avoid selecting initial seeds on spurs. To find proper seed, each candidate seed on the rough coastline is considered by means of judging the angle between the candidate seed and its adjacent selected seed from a certain point on the land, which is the mirror of the radar image center, i.e., the own-ship position. This procedure is illustrated in Fig.2. Search for the follow-up seed to seed *<sup>i</sup> v* in the counter clockwise direction along the initial coastline, where *O* is the own-ship position, and *O* is its mirror point

Then, a family of smoothed coastlines is derived in the Gaussian scale-space, as shown in Fig.3. Scale-space is a special type of multi-scale representation that comprises a continuous

> *Lgf* (.; ) (.; )

( ) [exp( 2 )] 2 

electronic chart is rather smooth, the scale-space derivation is only done for the SAR image.

*<sup>i</sup>* between *<sup>i</sup> v* and the candidate point <sup>1</sup> *<sup>i</sup> v*

(12)

. Because the coastline in the

*<sup>i</sup>* is small or even negative. Bypass those kinds of points

*<sup>i</sup>* with *<sup>i</sup> v* is larger than a predefined threshold.

perpendicular to the course. Judge the angle

Fig. 2. Dispose the burr noise of radar coastline

where the Gaussian kernel is 2 2 *gx x*

scale parameter and preserves the same spatial sampling at all scales.

Fig. 3. Continuous smooth coastline in different scale spaces

from *O* . Spur may occur when

<sup>1</sup> *<sup>i</sup> v* until meet a point *i*<sup>1</sup> *v* whose angle

#### **3.4.2.1 Coarse registration in the phase domain**

The image registration technique based on the Fourier-Mellin transform finds application in many different fields thanks to its high accuracy, robustness and low computational cost. It can register images that are misaligned by rotation, scaling and translation. The basic tool for translation estimation is the Fourier shift theorem. Denote

$$\mathcal{F}\{f(x,y)\} \triangleq F(w_x, w_y) \tag{13}$$

which is the Fourier transform of $f(x,y)$. Then

$$\mathcal{F}\{f(x+\Delta x,\ y+\Delta y)\} = F(w_x, w_y)\, e^{j(w_x \Delta x + w_y \Delta y)} \tag{14}$$

And the image translation can be estimated by the cross-spectrum of the two images.

$$\frac{F_1(u,v)\, F_2^{*}(u,v)}{\left| F_1(u,v)\, F_2^{*}(u,v) \right|} = e^{j(w_x \Delta x + w_y \Delta y)} \tag{15}$$

Assume that $s(x,y)$ is the transformed image of $r(x,y)$ after a translation $(\Delta x, \Delta y)$, a rotation $a$ and a uniform scaling $\sigma$ (the same in both the $x$ and $y$ directions):

$$s(x, y) = r[\sigma(x\cos a + y\sin a) - \Delta x,\ \sigma(-x\sin a + y\cos a) - \Delta y] \tag{16}$$

The inverse transform of Equation (15) yields a two-dimensional pulse at the position $(\Delta x, \Delta y)$ in the $(x, y)$ space. The relation between the corresponding Fourier transforms $S(u,v)$ and $R(u,v)$ of $s(x,y)$ and $r(x,y)$ is:

$$S(u, v) = e^{-j\phi_s(u, v)}\, \sigma^{-2}\, R[\sigma^{-1}(u\cos a + v\sin a),\ \sigma^{-1}(-u\sin a + v\cos a)] \tag{17}$$

And the corresponding amplitude spectrum is:

$$\left| S(u, v) \right| = \sigma^{-2} \left| R[\sigma^{-1}(u\cos a + v\sin a),\ \sigma^{-1}(-u\sin a + v\cos a)] \right| \tag{18}$$

Then, the rotation angle and the scaling factor can be calculated in the log-polar coordinates.

$$s_{p1}(\theta, \log\rho) = r_{p1}(\theta - a,\ \log\rho - \log\sigma) \tag{19}$$

Evidently, the phase correlation of $s_{p1}$ and $r_{p1}$ will yield a 2-D pulse at $(a, \log\sigma)$ in the semi-log-polar $(\theta, \log\rho)$ coordinates. The phase-correlation method computes the transformation parameters by taking the curve as a whole, which has the advantages of low computational cost and good noise immunity. This procedure is repeated in the Gaussian scale-space with a set of decreasing observation scales, and the two images are registered from coarse to fine. The transformation parameters are finally evaluated by clustering based on evidence theory.
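The translation part of this scheme (Equations 13-15) can be illustrated with a generic phase-correlation routine; this is a sketch of the cross-spectrum idea, not the authors' implementation.

```python
import numpy as np

def phase_correlation(f1, f2):
    """Translation estimation from the normalized cross-spectrum
    (Equations 13-15): its inverse FFT is a pulse at the shift."""
    F1, F2 = np.fft.fft2(f1), np.fft.fft2(f2)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12        # keep the phase only
    pulse = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(pulse), pulse.shape)
    # Circular correlation: fold large indices to negative shifts.
    if dy > f1.shape[0] // 2:
        dy -= f1.shape[0]
    if dx > f1.shape[1] // 2:
        dx -= f1.shape[1]
    return dx, dy
```

The returned pair is the shift that maps the first image onto the second; the rotation and scaling of Equations (16)-(19) are recovered by running the same routine on the log-polar amplitude spectra.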


#### **3.4.2.2 The selection of control point and registration seed**

The derived curve is transformed into a graph, and the weight of each node is represented by the energy defined by:

$$E_\sigma = w_i\, \sigma^2(v_i, v_{i-1}) = w_i \left[ \sum_{x \in \widehat{v_i v_{i-1}}} \left( C_\sigma(x) - \overline{C}_\sigma(x) \right)^2 \right] \tag{20}$$

where $C_\sigma$ is the Gaussian curvature at scale $\sigma$, defined as the product of the largest and the smallest curvatures over the segment $\widehat{v_i v_{i-1}}$:

$$C_\sigma(v_i, v_{i-1}) = L_{vv} L_w^2 = L_{xx} L_y^2 - 2 L_x L_y L_{xy} + L_{yy} L_x^2 \tag{21}$$

where $L$ is the Gaussian-smoothed image and the subscripted terms are its partial derivatives. $C_\sigma$ turns out to be a good corner detector, and is an important invariant feature for describing the structure of a derived curve in a certain scale-space. The scaled energy $E_\sigma$ is a third-order vector that describes the variance of curvature.

The nodes with large $E_\sigma$ are selected as control points. At locally straight points the Gaussian curvature is zero, and the connections of these points form a parabolic line. Every two adjacent parabolic lines then delimit a registration curve fragment. This method ensures that each seed curve contains the typical topology of its local region.
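A discrete stand-in for this selection rule, with finite-difference curvature in place of the scale-space Gaussian curvature of Eq. (21), and an illustrative window size:

```python
import numpy as np

def curvature(points):
    """Discrete curvature of a 2-D curve via finite differences."""
    d1 = np.gradient(points, axis=0)
    d2 = np.gradient(d1, axis=0)
    num = d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]
    return num / ((d1[:, 0]**2 + d1[:, 1]**2) ** 1.5 + 1e-12)

def control_points(points, n=21, window=5):
    """Select the n nodes with the largest curvature-variance energy,
    a discrete stand-in for the scaled energy of Eq. (20)."""
    c = curvature(points)
    pad = np.pad(c, window // 2, mode="wrap")
    energy = np.array([np.var(pad[i:i + window]) for i in range(len(c))])
    return np.argsort(energy)[-n:]
```

On an L-shaped polyline the selected nodes concentrate around the corner, where the curvature variance peaks.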

#### **3.4.3 Precise registration based on the principle curve graph**

The Hausdorff distance is adopted as the similarity metric, and the best matching feature curve fragment is obtained with the minimum distance classifier. This procedure is repeated in the Gaussian scale-space with a set of decreasing observation scales, and the two images are registered from coarse to fine. The Hausdorff distance between the registration curve $N_1$ and the reference curve $M_1$ is defined by Equation (22).

$$D_{Hausdorff}(N_1, M_1) = \max\big( d_F(N_1, M_1),\ d_B(N_1, M_1) \big) = \max\Big( \max_{L_j \in N_1} \min_{L_i \in M_1} \left\| L_i - L_j \right\|,\ \max_{L_i \in M_1} \min_{L_j \in N_1} \left\| L_i - L_j \right\| \Big) \tag{22}$$

If $D_{Hausdorff}(N_1, M_1) < \varepsilon$, the two curves are matched, where $\varepsilon$ is a given threshold. The matching metric is illustrated in Fig. 4.

Fig. 4. The Hausdorff distance metric
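The metric of Equation (22) for two sampled curves, as a direct brute-force sketch:

```python
def hausdorff(curve_a, curve_b):
    """Symmetric Hausdorff distance between two point sets (Eq. 22):
    the larger of the forward and backward directed distances."""
    def directed(ps, qs):
        return max(min(((px - qx)**2 + (py - qy)**2) ** 0.5
                       for qx, qy in qs)
                   for px, py in ps)
    return max(directed(curve_a, curve_b), directed(curve_b, curve_a))
```

Two curve fragments are declared matched when `hausdorff(N1, M1)` falls below the given threshold.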

#### **3.4.4 Experiment analysis**


The experimental images were obtained at a narrow channel of the Yangzi River, China, on May 1st, 2007. The own ship's position (OS POSN) is [32°13.525 N, 119°40.368 E], at a speed of 13.8 knots and on a course of 235.0°. The electronic chart of this area is a 2002 edition, and many new docks have been built since. Moreover, the inland electronic chart in China uses the Gauss-Kruger coordinates, while the radar image uses the projected polar coordinates. The different coordinate systems add extra deformation between the two images.

We choose a series of image sections from the electronic chart as the reference image, and take the radar image as the image to be matched. The chart sections are selected along the coastline, each with half the size of the chart, and the registration is done between each chart section and the radar image. Because the two images come from different sensors, coarse registration at a single scale cannot produce a prominent pulse in the $(\theta, \log\rho)$ space, as shown in Fig. 5. The registration procedure is therefore repeated at the scale levels $2^i$, $i = 1, \ldots, 6$. The estimated transformation parameters are then clustered as $(\hat{a}, \hat{s}, \hat{t}_x, \hat{t}_y) = [1.9,\ 0.344,\ 353,\ 725]$.

Fig. 5. The IFFT of the spectrum at a single scale

Twenty-one pairs of control points are selected from both the derived SAR image and the chart at the scale level of 16. Using the Hausdorff distance metric, the transformation parameters of the second registration are obtained as $(\hat{a}, \hat{s}, \hat{t}_x, \hat{t}_y) = [1.9,\ 0.357,\ 371,\ 660]$.

The registration results are shown in Fig. 6. The registration performance is evaluated by manually registering a remote sensing image from Google Earth with the nautical chart. The registered image is at [32°13.369 N, 119°40.279 E], 7 m from its true position, and the rotation bias is -1.1°. The result shows that our method is feasible. Errors come mainly from the strong echoes of various objects near the shore.

#### **3.5 Wave direction estimation based on local energy orientation**

This method is based on the Gabor filter. According to the local-energy theory of Morrone and Owens, the local energy of an image is the root mean square response of a filter set formed by an even-symmetric filter $M_e$ and an odd-symmetric filter $M_o$, and it reaches its largest value at singular points such as edges and corners.


Fig. 6. Registered image pairs. (left) radar image and nautical chart, (right) remote sensing image and nautical chart

$$E(x, y) = \sqrt{\left( M_e * f(x, y) \right)^2 + \left( M_o * f(x, y) \right)^2} \tag{23}$$

The steerable filter is a linear combination of a set of basis filters, which are partially overlapped in the frequency domain, and can be rotated to any orientation. An orthogonal (quadrature) filter pair is the combination of a steerable filter and its Hilbert transform, designed to precisely detect features such as edges, textures and singular points of the target. To obtain the 2-D local energy in continuous frequency space, the wavelet transform is used to decompose the signal into a series of sub-band signals with particular frequencies. Here we use the Mexican-hat wavelet $G_2$ to build the steerable filter $G_2^{\theta}$:

$$G_2^{\theta} = k_1(\theta)\, G_2^{0} + k_2(\theta)\, G_2^{\pi/3} + k_3(\theta)\, G_2^{2\pi/3} \tag{24}$$
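The steering relation (24) can be checked numerically. The sketch below assumes the standard second-derivative-of-Gaussian kernel for $G_2$ and the usual interpolation functions $k_j(\theta) = \frac{1}{3}[1 + 2\cos(2(\theta - \theta_j))]$, which the text does not spell out.

```python
import numpy as np

def g2(theta_deg, sigma=2.0, size=15):
    """Second directional derivative of a Gaussian: a G2-type kernel
    oriented at theta (assumed form, not taken from the source)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    t = np.radians(theta_deg)
    u = x * np.cos(t) + y * np.sin(t)
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return (u**2 / sigma**4 - 1 / sigma**2) * g

def steer(theta_deg, bases=(0.0, 60.0, 120.0)):
    """Eq. (24): G2 at any orientation as a linear combination of the
    three fixed basis filters with k_j(theta) = (1 + 2cos(2(theta -
    theta_j))) / 3."""
    t = np.radians(theta_deg)
    out = 0.0
    for tj in bases:
        k = (1.0 + 2.0 * np.cos(2 * (t - np.radians(tj)))) / 3.0
        out = out + k * g2(tj)
    return out

# For this kernel family the steering identity is exact:
# np.allclose(steer(25.0), g2(25.0)) holds.
```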

The Mexican-hat wavelet $G_2(x, y) = (2x^2 - 1)\exp[-(x^2 + y^2)]$ is a symmetric filter with a sharp, narrow bandwidth; it can effectively restrain noise and enhance the signal in a particular frequency band, and is commonly used in multi-scale edge detection. The fundamental filters $G_2^{0}$, $G_2^{\pi/3}$ and $G_2^{2\pi/3}$ are the forms of $G_2$ rotated to $0$, $\pi/3$ and $2\pi/3$, respectively, and $k_i(\theta)$ is the interpolation function corresponding to each fundamental filter. The form of $G_2$ at any orientation is then represented by the linear combination of $G_2^{0}$, $G_2^{\pi/3}$ and $G_2^{2\pi/3}$. Using the orthogonal filter bank formed by the steerable filter $G_2^{\theta}$ and its Hilbert transform $H_2^{\theta}$, we can obtain the direction energy of an arbitrary pixel $(x, y)$ of the image in an arbitrary direction:

$$E^{\theta}(x, y) = \sqrt{\left( G_2^{\theta} * f(x, y) \right)^2 + \left( H_2^{\theta} * f(x, y) \right)^2} \tag{25}$$

For a singular feature such as an edge, the direction energy reaches its maximum when the orthogonal filter pair is aligned with the feature's direction. The corresponding direction of the local orientation energy is called the principal direction of the pixel's local energy.
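A runnable sketch of the oriented-energy search: an even/odd Gabor quadrature pair stands in for the $(G_2^{\theta}, H_2^{\theta})$ bank (an assumption made for illustration), and the principal direction is the orientation with the maximum total energy, as in Eq. (25).

```python
import numpy as np

def gabor_pair(theta_deg, freq=0.15, sigma=4.0, size=21):
    """Even/odd Gabor kernels at one orientation: a quadrature pair
    standing in for (G2_theta, H2_theta)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    t = np.radians(theta_deg)
    u = x * np.cos(t) + y * np.sin(t)      # axis across the wave crests
    env = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    even = env * np.cos(2 * np.pi * freq * u)
    return even - even.mean(), env * np.sin(2 * np.pi * freq * u)

def direction_energy(img, theta_deg):
    """Total oriented local energy sqrt((even*f)^2 + (odd*f)^2),
    with FFT-based linear convolution."""
    even, odd = gabor_pair(theta_deg)
    shape = (img.shape[0] + even.shape[0] - 1,
             img.shape[1] + even.shape[1] - 1)
    F = np.fft.rfft2(img, shape)
    e = np.fft.irfft2(F * np.fft.rfft2(even, shape), shape)
    o = np.fft.irfft2(F * np.fft.rfft2(odd, shape), shape)
    return float(np.sqrt(e**2 + o**2).sum())

def principal_direction(img, angles=(0, 30, 60, 90, 120, 150)):
    """The orientation with maximum energy: the principal direction."""
    return max(angles, key=lambda a: direction_energy(img, a))
```

On a synthetic sinusoidal wave field the maximum-energy orientation coincides with the direction in which the pattern varies.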


In this algorithm, the wave image is first filtered with a Lee filter to suppress speckle, and on this basis the principal energy direction of the waves can be estimated.

The experiment uses ENVISAT-1 ASAR data acquired between 30th Sep. and 19th Oct. 2008, and the experimental area covers 30°48′N ~ 31°20′N, 122°10′E ~ 122°47′E. We use the wave direction estimation based on the local energy direction to calculate the wave direction from the AP polarization data. The calculation results are compared with the JMH wave analysis chart from the Japan Meteorological Agency. Table 3 shows the results of this wave direction estimation algorithm, listing the direction energy at the six filter orientations 0° to 150° for each image.

| Acquisition time (UTC) | Polarization mode | Incident angle (°) | 0° | 30° | 60° | 90° | 120° | 150° | Estimated direction (°) |
|---|---|---|---|---|---|---|---|---|---|
| 2008-09-30 13:53 | VV | 41.1016 | 124.3043 | 117.9377 | 132.5371 | 135.8590 | 131.9720 | 118.6072 | 75.1857 |
| 2008-09-30 13:53 | VH | 41.1016 | 111.2229 | 116.8233 | 133.8221 | 136.7364 | 132.8555 | 116.2534 | |
| 2008-10-08 01:50 | HH | 33.9364 | 78.0928 | 97.9058 | 130.5053 | 145.1742 | 130.5707 | 99.4210 | 104.2056 |
| 2008-10-08 01:50 | VV | 33.9364 | 77.5663 | 99.3430 | 133.5955 | 146.8708 | 133.2834 | 99.9747 | |
| 2008-10-10 13:39 | HH | 19.2636 | 14.0134 | 33.8942 | 133.1682 | 268.4664 | 132.8741 | 34.1923 | 99.9323 |
| 2008-10-10 13:39 | HV | 19.2636 | 7.7873 | 19.1000 | 74.9717 | 153.2269 | 75.1100 | 18.9022 | |
| 2008-10-19 13:56 | HH | 44.0092 | 120.3329 | 115.1276 | 120.3069 | 116.8428 | 120.2216 | 118.1973 | 104.8161 |
| 2008-10-19 13:56 | HV | 44.0092 | 66.6512 | 69.6033 | 73.6283 | 71.8446 | 73.2198 | 69.0945 | |

Table 3. The experiment result of the wave direction estimation algorithm

The experimental result analysis shows that the VV polarization mode is the best for wave analysis, followed by HH, while the cross-polarization VH and HV modes are not ideal.

#### **4. The architecture of the remote sensing aided maritime search and rescue system**

The Remote Sensing Monitoring System for Maritime Search and Rescue (RS-MSR) consists of four modules: the satellite transit inquiry module, the vessel detection module, the sea state analysis module and the integrated processing module. The ship detection module has three functions: ship location, ship type identification/classification and ship movement direction estimation. The sea state analysis module mainly estimates the wave direction. The integrated processing module receives the detection results from the ship detection module and the sea state analysis module. From the distressed ship's position, heading and the wave direction, combined with the time spent on data receiving, it estimates the current position of the distressed ship; combined with the satellite parameters, it can also revise the result obtained through ship detection. The analysis results of the integrated processing module can be transmitted to the Maritime Safety Administration (MSA) and the rescue vessels on the working field, providing decision support on the search areas for the rescue work. Fig. 7 describes the architecture of RS-MSR.

Fig. 7. The architecture of the Remote Sensing Monitoring System for Maritime Search and Rescue (RS-MSR)

#### **4.1 Satellite transit inquiry module**

With the development of astronavigation, the number of satellites equipped with SAR sensors is increasing. Faced with so many satellites with different purposes, it has become a tough problem for clients to judge and select what they want quickly. RS-MSR sets up a real-time satellite coverage inquiry system covering the commonly used satellites around the world, such as RadarSat, Envisat, ERS, CosmoSAR and TerraSAR, helping clients to retrieve quickly the crossing time and the orbit data of these satellites over a specific area.

#### **4.2 Ship detection module**

The ship detection module is the core unit of the whole RS-MSR system. Using the micro-area images of distressed areas supplied by satellite, it can detect and monitor the ships and the accident areas, supplying clues about the distressed ship for search and rescue and helping to determine the search areas quickly. This module consists of three parts: (1) ship detection; (2) ship classification/identification; and (3) ship direction and course estimation.

#### **4.3 Sea state analysis module**

The sea state analysis module performs an initial analysis of the situation in the distressed area by estimating the wave direction, and supplies a foundation for search and rescue decisions, which is useful for estimating the drift direction and location of the distressed ships.
