**2. Methodology**

Various imaging techniques, including infrared imaging [5, 6], passive millimeter-wave (MMW) imaging [7–9], active MMW imaging [10, 11], X-ray imaging [12, 13] and holographic imaging [14–17], have been investigated for concealed metallic object detection. Recent studies have demonstrated that passive MMW imaging has the most potential to become a useful tool for identifying concealed MFOs under clothing [18, 19], and it has been investigated for various applications including security, military, surveillance and biomedical uses [20].

Passive MMW imaging sensor techniques offer the best near-term potential for providing a non-invasive method of observing metallic and plastic objects concealed underneath common clothing. However, MMW cameras alone cannot provide useful information about the detail and location of the individual being monitored. The passive MMW system can produce indoor and outdoor images in bad weather, such as smoke and fog [21]. It has been applied to scan human subjects moving in an unconstrained flow; however, the MMW image has poor quality due to low-level signals and system noise [5]. In order to improve the accuracy and specificity of MMW for detecting concealed MFOs, many researchers have aimed to develop new MMW approaches, including improved imaging algorithms and implementation systems.

The holographic technique was first applied to microwave imaging in 1948 [22]. An interference pattern between a reference wave and the wave diffracted by an object is recorded to produce a hologram that can be digitally stored. The hologram is reconstructed by numerically synthesizing the reference wave, the well-known wavefront reconstruction process. The target object can then be reconstructed from the measured reflections and holograms. Holographic approaches differ from the conventional synthetic aperture radar imaging method, particularly in imaging geometry, and no field approximations are required for holography; such approaches have recently been applied in the MMW band for MFO detection [14, 15]. Farhat and Guard [14] applied the holographic approach to concealed weapon detection, and the technique was dramatically improved by Collins et al. [15]. A concealed MFO detection system aims to extract features of MFOs and reconstruct the MFOs from the measured data. The image quality is often limited by low signal-to-noise ratio and long scan time. Existing MMW methods use a multi-frequency approach, reconstructing a 3D image from a sequence of 2D images obtained at different frequencies. However, multi-frequency MMW methods are difficult to implement in practice, and the broadband measurements also introduce large noise.

This chapter demonstrates the feasibility of using a single-frequency 3D holographic millimeter-wave (HMMW) imaging system and method to detect various small MFOs in an inhomogeneous medium. A computer model was developed under the MATLAB environment to validate the proposed theory and measurement system setups. The system contains a HMMW measurement model and various realistic models. Simulation and experimental validations are performed to evaluate the accuracy, effectiveness and performance of the proposed theory. The remainder of this chapter is organized as follows. Section 2 introduces the 3D HMMW measurement system and imaging processing. Sections 3 and 4 present simulation and experimental performances. Section 5 gives the discussion and conclusion of this study.

126 Emerging Microwave Technologies in Industrial, Agricultural, Medical and Food Processing

#### **2.1. Imaging measurement system**

**Figure 1** shows the proposed HMMW system for concealed metallic object detection. The system contains an RF generator (vector network analyzer, VNA) that supplies the microwave signals; a data acquisition unit consisting of a single transmitter that illuminates the target object and an array of receivers that measure the scattered electric fields from the target object; a signal and imaging processor that analyzes the measured signals (which contain phase and amplitude information) and reconstructs an image of the target object using an imaging algorithm; and an image display unit that displays the reconstructed image.

During data collection, port one of the VNA transmits millimeter waves toward the object of interest, and the backscattered electromagnetic fields from the object are recorded at each receiver in the detector array plane, which is connected to the second port of the VNA. The distance between the target object and the data acquisition unit is in the far-field region. The recorded signals include phase and amplitude information, which are used to compute the complex visibility data for each possible pair of receivers. An image of the object can then be reconstructed from the recorded data using the HMMW algorithm.
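As a minimal sketch of this step (with synthetic placeholder recordings, since no measured data are given here), the amplitude and phase recorded at each receiver can be combined into complex field samples, from which the complex visibility of every receiver pair follows:

```python
import numpy as np

# Synthetic recordings for a 15-receiver array (the 16-element array
# described later uses one element as the transmitter): the VNA provides
# a linear amplitude and a phase (radians) per receiver channel.
rng = np.random.default_rng(0)
amplitude = rng.uniform(0.1, 1.0, size=15)
phase = rng.uniform(-np.pi, np.pi, size=15)

# Complex field sample at each receiver: E = A * exp(j * phi).
E = amplitude * np.exp(1j * phase)

# Complex visibility for every ordered receiver pair (i, j):
# V_ij = E_i * conj(E_j); the diagonal V_ii is the receiver power A_i**2.
V = np.outer(E, np.conj(E))
```

With measured data, `amplitude` and `phase` would instead come from the VNA readings for each receiver channel.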

**Figure 1.** Experimental procedure.

#### **2.2. Antenna part**

In order to investigate the feasibility of using the 3D HMMW system to detect concealed MFOs, 16 four-band patch antennas or waveguide antennas were simulated as both transmitters and detectors. **Figure 2** shows the designed four-band patch antenna with a length of 10.7 mm, width of 6.3 mm, and height of 0.254 mm. As shown in **Figure 2(b)** and **(c)**, the top and bottom layers of the proposed antenna contain 12 holes (0.15 mm in diameter) that enable operation in four broad bands. The substrate material between the two layers of the antenna is RT/duroid 6002 with dielectric property close to 1.


3D Holographic Millimeter-Wave Imaging for Concealed Metallic Forging Objects Detection


http://dx.doi.org/10.5772/intechopen.73655


**Figure 2.** (a) Multiband patch antenna; (b) top layer of the designed antenna: (1) top antenna, (3) holes, (4 and 5) trumpet holes; (c) bottom layer of the antenna: (2) bottom antenna, (7) feed probe hole; (d) sideview of the antenna: (6) feed probe, (8) dielectric layer, (9) ground plate; (e) sensor array configuration.

The incident electric field from each transmitter is [23]:


$$\vec{E}_{inc}(R, \theta, \phi) = \left(-\frac{jk_0}{2\pi^2}\right) \vec{E}_0 \left(\frac{e^{-jk_0 R_0}}{R_0}\right) AB\, h(\theta, \phi)\, \vec{P}(\theta, \phi) \tag{1}$$

Where $k_0$ is the propagation constant of free space, $R$ and $R_0$ are the distances from the object to the detector and transmitter, respectively, $\vec{E}_0$ is the wave amplitude of the TE10 mode within the waveguide aperture, $A$ and $B$ are the narrow and wide aperture dimensions of the antenna, respectively, $h$ is the radiation pattern, and $\vec{P}$ is the polarization vector.
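Eq. (1) can be evaluated directly; in this sketch the operating frequency, aperture dimensions, and the cosine radiation pattern are illustrative assumptions, and the polarization vector is dropped to keep the field scalar:

```python
import numpy as np

c = 3e8
f = 35e9                       # assumed operating frequency, Hz
k0 = 2 * np.pi * f / c         # free-space propagation constant
A, B = 3.6e-3, 7.1e-3          # assumed narrow/wide aperture dimensions, m
E0 = 1.0                       # normalized TE10 wave amplitude

def E_inc(R0, theta, phi):
    """Scalar incident field per Eq. (1), with placeholder pattern h = cos(theta)."""
    h = np.cos(theta)
    return (-1j * k0 / (2 * np.pi**2)) * E0 * np.exp(-1j * k0 * R0) / R0 * A * B * h

# Incident field 200 mm from the aperture, on boresight.
field = E_inc(R0=0.2, theta=0.0, phi=0.0)
```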

In far-field condition, the scattered electric field from the object can be computed as [24]:

$$
\vec{E}_{scat}(\vec{r}) = \left(\frac{k_0^2}{4\pi}\right) \int_V (\varepsilon(\vec{s}) - \varepsilon_0)\, \vec{E}_{inc}(\vec{s})\, \frac{e^{-jk_0 R}}{R}\, dV \tag{2}
$$

Where $\varepsilon(\vec{s})$ and $\varepsilon_0$ are the complex relative permittivities of the object and free space, respectively, and $R$ is the distance between the object and the detector.
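Eq. (2) can be evaluated numerically by discretizing the object into voxels and summing their contributions; the geometry, permittivity, and plane-wave illumination below are assumed placeholders:

```python
import numpy as np

k0 = 2 * np.pi * 35e9 / 3e8                # assumed 35 GHz operation

# A small cubic object sampled on a 5 x 5 x 5 grid of 2 mm voxels.
grid = np.linspace(-4e-3, 4e-3, 5)
sx, sy, sz = np.meshgrid(grid, grid, grid, indexing="ij")
dV = (2e-3) ** 3
eps = np.full(sx.shape, 3.0 - 0.1j)        # assumed object permittivity
eps0 = 1.0
E_inc = np.ones(sx.shape, dtype=complex)   # normalized plane-wave illumination

def E_scat(r):
    """Scalar scattered field at detector position r = (x, y, z) per Eq. (2)."""
    R = np.sqrt((r[0] - sx) ** 2 + (r[1] - sy) ** 2 + (r[2] - sz) ** 2)
    contrib = (eps - eps0) * E_inc * np.exp(-1j * k0 * R) / R
    return (k0**2 / (4 * np.pi)) * np.sum(contrib) * dV

# Scattered field at a detector 200 mm below the object.
E_det = E_scat(np.array([0.0, 0.0, -0.2]))
```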

#### **2.3. Signal and imaging processing**

As shown in **Figure 3**, for a point Q located within a 3D object, the visibility for any two detectors located at $\vec{r}_i$ and $\vec{r}_j$ can be computed as [25]:

$$
\vec{V}_{ij} = \; <\vec{E}_{scat}(\vec{r}_i) \cdot \vec{E}_{scat}^{\,*}(\vec{r}_j)> \tag{3}
$$

Where \* denotes the complex conjugate and < > denotes the time average.

The total visibility data can be computed as:

$$
\vec{V} = \sum_{i}^{N} \vec{V}_{ij}, \quad N \ge 3, \; i \ne j \tag{4}
$$
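Eqs. (3) and (4) amount to time-averaging pairwise products of the detector signals; this sketch uses scalar fields in place of the vector dot product and synthetic time series:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 15                                   # receiving detectors (N >= 3), assumed
T = 100                                  # time samples in the average

# Synthetic complex scattered-field time series at each detector.
E = rng.standard_normal((N, T)) + 1j * rng.standard_normal((N, T))

# Eq. (3): V_ij = <E_i * conj(E_j)>, a time average over the T samples.
V = (E @ E.conj().T) / T

# Eq. (4): total visibility over distinct pairs (subtracting the trace
# removes the i == j terms).
V_total = V.sum() - np.trace(V)
```

Because $V_{ji} = V_{ij}^*$, the visibility matrix is Hermitian, which is a convenient sanity check on measured data.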

Define the object intensity distribution at position $\vec{s}$ as [26]:

$$I(\vec{s}) = \left(\frac{k_0^2}{4\pi}\right)^2 \left|\varepsilon(\vec{s}) - \varepsilon_0\right|^2 \vec{E}_T(\vec{s}) \cdot \vec{E}_T^{\,*}(\vec{s}\,') \tag{5}$$

All detectors are located on the same plane; thus, the line integral is defined as:

$$\widetilde{I}(l,m) = \int_s \frac{I(s,l,m)}{\sqrt{1-l^2-m^2}}\, ds \tag{6}$$

Where $l = \sin\theta\cos\phi$ and $m = \sin\theta\sin\phi$; $u = (x_j - x_i)/\lambda_0$ and $v = (y_j - y_i)/\lambda_0$, where $\lambda_0$ is the free-space wavelength.
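These definitions can be applied directly; the detector positions, source direction, and 35 GHz wavelength in this sketch are illustrative assumptions:

```python
import numpy as np

lam0 = 3e8 / 35e9                        # assumed 35 GHz free-space wavelength, m

# Direction cosines of an example source direction.
theta, phi = np.deg2rad(20.0), np.deg2rad(45.0)
l = np.sin(theta) * np.cos(phi)          # l = sin(theta) * cos(phi)
m = np.sin(theta) * np.sin(phi)          # m = sin(theta) * sin(phi)

# Baseline of one detector pair in the array plane (positions in metres).
xi, yi = 0.00, 0.00
xj, yj = 0.05, 0.02
u = (xj - xi) / lam0                     # baseline x-component in wavelengths
v = (yj - yi) / lam0                     # baseline y-component in wavelengths
```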

**Figure 3.** (a) Geometry of two detectors, (b) scattering characterization scheme from different receiving heights [24].

A 2D image is obtained by using the inverse fast Fourier transform:

$$\widetilde{I}(l,m) = \iint V(u, v)\, e^{j2\pi(ul + vm)}\, du\, dv \tag{7}$$
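As a round-trip check of Eq. (7) (grid size and source direction are arbitrary choices), the visibility of a known point source can be synthesized on a regular (u, v) grid and the image recovered with a 2D inverse FFT:

```python
import numpy as np

n = 64
l0, m0 = 0.25, -0.125                    # true source direction cosines

# Integer (u, v) sample grid matching the FFT convention.
u = np.fft.fftfreq(n, d=1.0 / n)
uu, vv = np.meshgrid(u, u, indexing="ij")

# Visibility of a unit point source at (l0, m0):
# V(u, v) = exp(-j * 2 * pi * (u * l0 + v * m0)).
V = np.exp(-2j * np.pi * (uu * l0 + vv * m0))

# The 2D inverse FFT implements the (u, v) integral of Eq. (7); the peak of
# the magnitude image falls at the pixel whose (l, m) equals (l0, m0).
I = np.abs(np.fft.ifft2(V))
```

Here pixel (a, b) corresponds to (l, m) = (a/n, b/n) modulo 1, so the source at (0.25, −0.125) peaks at pixel (16, 56).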

The 2D image difference between each pair of 2D images is computed by differentiating the 2D images obtained with the sensor array plane placed at different heights ($H$):

$$I(H = z_n, l, m) = d\widetilde{I}(l,m)\cdot(1 - l^2 - m^2)/dz \tag{8}$$

$$d\widetilde{I}/dz = \left(\widetilde{I}_{Z_n} - \widetilde{I}_{Z_{n-1}}\right)/\left(Z_n - Z_{n-1}\right) \tag{9}$$

Where $z_n = s_n \cos\theta_n$, and $\theta_n$ is the receiving angle of the position $s_n$ with the sensor plane placed at the selected height $H_n$ (**Figure 3(b)**).

A 3D image can be reconstructed by acquiring the measured 2D intensity distributions when the sensor array plane is placed at different vertical locations and computing a sequence of 2D images, $\widetilde{I}_{Z_n}$.

**3. Simulation**

A numerical system was developed under the MATLAB environment to investigate the proposed theory and system for detecting concealed MFOs. An array of 16 open-ended waveguide antennas was used, with one element serving as the transmitter and the others as receivers. The target object was located at *z* = 0 mm and was assumed to be fully contained in a rectangular imaging domain with a length of 300 mm. The sensor array plane was placed at *z* = −200 mm. Five models (see **Figure 4**) were developed using the published dielectric properties to evaluate the 3D HMMW [27].

Model I was made of two metallic spheres (10 mm in diameter; x1 = 0, y1 = 0, z1 = 35; x2 = 50, y2 = 0, z2 = 35) embedded in a cylindrical tank (240 mm in diameter and 70 mm in height) filled with clothing material; Model II was made of two wood spheres (5 mm in diameter; x1 = 0, y1 = 0, z1 = 35) embedded in a cylindrical tank; Model III was made of two wood

**Figure 4.** (a) Model I, (b) Model II, (c) Model III, (d) Model IV, (e) Model V (A1: matching medium, A2: cloth, A3: metallic object, A4: skin, A5: skull, A6: fat).
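A Model I-style phantom can be voxelized as in the following sketch; the grid resolution and the dielectric values assigned to clothing and metal are illustrative placeholders, not the published dielectric properties used in the chapter:

```python
import numpy as np

res = 2.0                                          # voxel size, mm (assumed)
x = np.arange(-150, 150, res)                      # 300 mm imaging domain
y = np.arange(-150, 150, res)
z = np.arange(0, 70, res)                          # 70 mm tank height
X, Y, Z = np.meshgrid(x, y, z, indexing="ij")

# Cylindrical tank, 240 mm in diameter, filled with clothing material.
tank = X**2 + Y**2 <= 120.0**2

# Two 10 mm diameter metallic spheres at (0, 0, 35) and (50, 0, 35) mm.
s1 = X**2 + Y**2 + (Z - 35) ** 2 <= 5.0**2
s2 = (X - 50) ** 2 + Y**2 + (Z - 35) ** 2 <= 5.0**2

# Complex relative permittivity map (placeholder values).
eps = np.ones(X.shape, dtype=complex)              # free space / matching medium
eps[tank] = 1.5 - 0.05j                            # clothing (assumed)
eps[s1 | s2] = 1.0 - 1e7j                          # metal as high-loss medium (assumed)
```

A map like `eps` can then be fed to a forward solver such as the Born-approximation sketch of Eq. (2) to generate synthetic scattered-field data.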
