**1. Introduction**

Many multiobjective (MO) optimization problems arise in the engineering field. Their objective functions are conflicting, and the optimal solutions of these problems are characterized by the Pareto front. Evolutionary algorithms (EAs) have been recognized as very efficient for solving MO optimization problems, since they can find a representative Pareto front in a single run. State-of-the-art algorithms are presented in [1–5]. These algorithms, which are population-based, are able to simultaneously explore various regions of the Pareto front.

In the last few years, immune systems have inspired new algorithms for solving MO problems. The fundamental principles of the artificial immune system (AIS) algorithm are clonal selection [6], mutation, and, more recently, recombination [7–15].

The nondominated neighbor-based immune optimization algorithm (NNIA) is effective for MO problems [9]. NNIA has shown that it is advantageous to incorporate a crossover operator into the algorithm. To do so, it uses simulated binary crossover (SBX). However, SBX is a recombination operator that performs its search near the recombination parents [16].

The Backtracking Search Optimization Algorithm (BSA) is a nature-inspired algorithm proposed in [17]. BSA features a special mechanism to generate effective trial individuals, the ability to solve different numerical optimization problems quickly, and a clear structure. Since its introduction, BSA has attracted many research studies and has been applied to various optimization problems [18–20].

In this chapter, we develop a novel hybrid MO immune method to solve continuous multiobjective optimization problems. The NNIA algorithm uses the best individuals of the trial population to drive the search toward the Pareto front, but it relies on SBX, which mainly has local search capability. In our proposal, the recombination is inspired by the crossover used in the BSA algorithm, with adaptations to fit the immune algorithm. Three variants are considered in this chapter, giving rise to new recombination operators for the immune algorithm.

This chapter is organized as follows: in Section 2, we introduce the MO problem. In Sections 3 and 4, the NNIA algorithm and the BSA recombination are presented, and we propose new algorithms to solve the MO problem. The effectiveness of these algorithms is investigated in Section 5 on various MO test problems. In Section 6, the chosen algorithm is applied to the multiobjective topology optimization of truss structures. A short summary concludes this chapter.

## **2. Multiobjective optimization problem**

The multiobjective optimization problem is formalized in this section. Concepts related to the Pareto front are introduced, firstly from a theoretical point of view and then when considering its approximation through a numerical approach [21].

#### **2.1 Pareto front**

Let us consider the following multiobjective optimization (MO) problem:

$$\min\_{\mathbf{x}\in\Omega} \mathbf{f}(\mathbf{x}) = \left(f\_1(\mathbf{x}), \dots, f\_m(\mathbf{x})\right)^\mathrm{T} \tag{1}$$

where *m* is the number of objective functions and **x** = (*x<sub>1</sub>*, ⋯, *x<sub>n<sub>x</sub></sub>*) ∈ Ω is the *n<sub>x</sub>*-dimensional decision vector, where each decision variable *x<sub>i</sub>* is bounded by lower and upper limits *x<sub>li</sub>* ≤ *x<sub>i</sub>* ≤ *x<sub>ui</sub>* for *i* = 1, ⋯, *n<sub>x</sub>*.

Pareto-front-related concepts are [22]:

1. *Pareto dominance*: Suppose **x**<sub>*a*</sub> and **x**<sub>*b*</sub> are two different feasible solutions to the MO problem. Then **x**<sub>*a*</sub> dominates **x**<sub>*b*</sub> if and only if

$$f\_i(\mathbf{x}\_a) \le f\_i(\mathbf{x}\_b) \quad \forall i \in \{1, \dots, m\} \tag{2}$$

*An Immune Multiobjective Optimization with Backtracking Search Algorithm Inspired… DOI: http://dx.doi.org/10.5772/intechopen.100306*

and:

$$\exists k \in \{1, \ldots, m\} \; f\_k(\mathbf{x}\_a) < f\_k(\mathbf{x}\_b) \tag{3}$$


2. *Pareto front*: Denoting by *X*\* the set of Pareto-optimal (nondominated) solutions, the Pareto front is its image in objective space:

$$F^\* = \left\{ \mathbf{f}(\mathbf{x}^\*) = \left( f\_1(\mathbf{x}^\*), \dots, f\_m(\mathbf{x}^\*) \right)^T \text{ such that } \mathbf{x}^\* \in X^\* \right\} \tag{4}$$
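As an illustration of the dominance relation of Eqs. (2)–(3), a dominance test and a nondominated filter can be sketched in a few lines (the helper names are illustrative, not part of NNIA itself):

```python
import numpy as np

def dominates(fa, fb):
    """True if objective vector fa Pareto-dominates fb (Eqs. 2-3):
    no worse in every objective and strictly better in at least one."""
    fa, fb = np.asarray(fa), np.asarray(fb)
    return bool(np.all(fa <= fb) and np.any(fa < fb))

def nondominated(F):
    """Boolean mask of the nondominated rows of the objective matrix F."""
    F = np.asarray(F)
    mask = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        for j in range(len(F)):
            if i != j and dominates(F[j], F[i]):
                mask[i] = False
                break
    return mask
```

The image of the rows kept by `nondominated` under **f** is then the discrete approximation of the Pareto front *F*\*.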

#### **2.2 Multiobjective solution**

In addition, the following crowding-distance (CD) measure is taken from reference [9]:


$$\text{CD}\left(\widehat{\mathbf{X}}\right) = \sum\_{j=1}^{m} \frac{D\_j\left(\widehat{\mathbf{X}}\right)}{f\_j^{\text{max}} - f\_j^{\text{min}} + \varepsilon\_D} \tag{5}$$

where *f<sub>j</sub><sup>max</sup>* and *f<sub>j</sub><sup>min</sup>* are the maximal and minimal values of the *j*-th objective, respectively, *ε<sub>D</sub>* is a small positive number, and:

$$D\_j\left(\widehat{\mathbf{X}}\right) = \begin{cases} \infty & \text{if} \quad \hat{\mathbf{x}}\_{k} \text{ is a boundary point of } \widehat{\mathbf{X}} \\ \min \left| f\_{j}\left(\hat{\mathbf{x}}\_{k}\right) - f\_{j}\left(\hat{\mathbf{x}}\_{l}\right) \right| & \text{otherwise} \end{cases} \tag{6}$$

for *k*, *l* ∈ {1, ⋯, *n<sub>X̂</sub>*} such that *k* ≠ *l*.
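A direct sketch of Eqs. (5)–(6), under the assumption that *k* and *l* range over the individuals of the dominant population and that boundary individuals on any objective receive an infinite crowding distance:

```python
import numpy as np

def crowding_distance(F, eps=1e-6):
    """Crowding distance of Eqs. (5)-(6) for an objective matrix F (n x m).
    Boundary individuals on an objective get an infinite distance; otherwise
    the per-objective term is the gap to the nearest neighbour value,
    normalized by the objective range plus eps."""
    F = np.asarray(F, dtype=float)
    n, m = F.shape
    cd = np.zeros(n)
    for j in range(m):
        fj = F[:, j]
        span = fj.max() - fj.min() + eps
        for k in range(n):
            if fj[k] <= fj.min() or fj[k] >= fj.max():
                cd[k] = np.inf          # boundary point of the front
            else:
                others = np.delete(fj, k)
                cd[k] += np.min(np.abs(others - fj[k])) / span
    return cd
```

Individuals with the largest crowding distance lie in the least-crowded regions and are the ones NNIA favours.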

## **3. Immune optimization algorithm and recombination operator**

#### **3.1 Nondominated neighbor immune optimization algorithm**

The nondominated neighbor immune algorithm (NNIA) uses nondominated neighbor-based selection and proportional cloning, and thus pays more attention to the less-crowded regions of the current trade-off front.

We denote by **X̂** the dominant population, by **X**<sub>*A*</sub> the active population, and by **X**<sub>*C*</sub> the clone population at time *t*; they are stored as time-dependent matrices of sizes *n<sub>X̂</sub>*, *n<sub>A</sub>*, and *n<sub>C</sub>*, respectively.

The NNIA algorithm is presented in Algorithm 1, where ≔ is the update operator, *n<sub>X̂ max</sub>* and *n<sub>A max</sub>* are the maximum sizes of the dominant and active populations, *n<sub>C</sub>* is the clone population size, and *n<sub>it</sub>* is the number of iterations.

**Algorithm 1** Pseudo code of NNIA

**Function** **X̂** = NNIA(*n<sub>x</sub>*, *m*, **f**(**x**), **x**<sub>*l*</sub>, **x**<sub>*u*</sub>, *n<sub>X̂ max</sub>*, *n<sub>A max</sub>*, *n<sub>C</sub>*, *n<sub>it</sub>*)

```
1: Generate a uniform random initial population X̂ of size nC × nx within the bounds xl and xu;
2: X̂ ≔ Find_Non-Dominated(X̂ | f(x));
3: for t = 0 : nit do
4:   XA ≔ CD_Truncation(X̂, nA max);
5:   XC ≔ Cloning(XA, nC);
6:   XC ≔ Recombination(XC, XA, xl, xu);
7:   XC ≔ Hypermutation(XC, xl, xu);
8:   X̂ ≔ Find_Non-Dominated([X̂, XC] | f(x));
9:   X̂ ≔ CD_Truncation(X̂, nX̂ max);
10: end for
```
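The overall flow of Algorithm 1 can be sketched in Python as follows. The operator bodies here are deliberately simplified stand-ins (truncation by slicing, cloning by resampling, recombination by averaging, hypermutation by Gaussian noise), not the actual NNIA operators detailed below; only the loop structure is faithful.

```python
import numpy as np

rng = np.random.default_rng(0)

def find_nondominated(X, f):
    """Keep the rows of X whose objective vectors are nondominated."""
    F = np.array([f(x) for x in X])
    keep = [i for i in range(len(X))
            if not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                       for j in range(len(X)) if j != i)]
    return X[keep]

def nnia_sketch(f, nx, xl, xu, n_hat_max=20, n_a_max=5, n_c=20, n_it=50):
    X_hat = rng.uniform(xl, xu, size=(n_c, nx))                    # step 1
    X_hat = find_nondominated(X_hat, f)                            # step 2
    for _ in range(n_it):                                          # step 3
        X_A = X_hat[:n_a_max]                                      # step 4: truncation stand-in
        X_C = X_A[rng.integers(len(X_A), size=n_c)]                # step 5: cloning stand-in
        mates = X_A[rng.integers(len(X_A), size=n_c)]
        X_C = 0.5 * (X_C + mates)                                  # step 6: recombination stand-in
        X_C = np.clip(X_C + rng.normal(0.0, 0.1, X_C.shape), xl, xu)  # step 7: hypermutation stand-in
        X_hat = find_nondominated(np.vstack([X_hat, X_C]), f)      # step 8
        X_hat = X_hat[:n_hat_max]                                  # step 9: truncation stand-in
    return X_hat
```

On a simple bi-objective test such as f(x) = (x², (x − 2)²) on [0, 2], this loop returns a small set of nondominated trade-off points.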

#### **3.2 Recombination and crossovers**

#### *3.2.1 NNIA recombination*

For recombination, NNIA adopts an operation used in many MO EAs [1, 4]: an antibody of the clone population and an antibody of the active population are selected and modified as:

$$\{\mathbf{X}\_{C}\}\_{ij} = \begin{cases} \frac{1-\beta}{2} \{\mathbf{X}\_{C}\}\_{ij} + \frac{1+\beta}{2} \{\mathbf{X}\_{A}\}\_{kj} & \text{if} \quad a=1 \ \&\ b=0\\ \frac{1+\beta}{2} \{\mathbf{X}\_{C}\}\_{ij} + \frac{1-\beta}{2} \{\mathbf{X}\_{A}\}\_{kj} & \text{if} \quad a=1 \ \&\ b=1\\ \{\mathbf{X}\_{C}\}\_{ij} & \text{if} \quad a=0 \end{cases} \quad a \sim \mathcal{U}(0,1), \ b \sim \mathcal{U}(0,1)$$

for *i* ∈ {1, … , *n<sub>C</sub>*}, *j* ∈ {1, … , *n<sub>x</sub>*}, and *k* a random integer uniformly chosen in {1, … , *n<sub>A</sub>*}. Above, 𝒰 is the uniform discrete distribution, and *β* is a sample from a random distribution having the density:

$$p(\beta) = \begin{cases} 0 & \text{if} \quad \beta < 0\\ \frac{\eta + 1}{2} \beta^{\eta} & \text{if} \quad 0 \le \beta \le 1\\ \frac{\eta + 1}{2 \beta^{\eta + 2}} & \text{if} \quad \beta > 1 \end{cases}$$

where *η* is a real nonnegative distribution index. Hence, four independent random variables are involved in this recombination operation: *a*, *b*, *k*, and *β*. A boundary control is then performed by:


$$\{\mathbf{X}\_{C}\}\_{ij} \coloneqq \begin{cases} x\_{lj} & \text{if } \quad \{\mathbf{X}\_{C}\}\_{ij} < x\_{lj} \\ \{\mathbf{X}\_{C}\}\_{ij} & \text{if } \quad x\_{lj} \le \{\mathbf{X}\_{C}\}\_{ij} \le x\_{uj} \\ x\_{uj} & \text{if } \quad \{\mathbf{X}\_{C}\}\_{ij} > x\_{uj} \end{cases}$$
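A vectorized sketch of this SBX-style recombination with boundary control; *β* is drawn by inverse-transform sampling of the density p(β), which is the standard SBX sampling rule, and the function names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_beta(eta, size):
    """Draw beta from the SBX density p(beta) by inverse-transform sampling."""
    u = rng.random(size)                      # u in [0, 1)
    return np.where(u <= 0.5,
                    (2.0 * u) ** (1.0 / (eta + 1.0)),
                    (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0)))

def sbx_recombination(XC, XA, xl, xu, eta=15):
    """Elementwise SBX-style recombination of the clone population XC with
    randomly selected rows of the active population XA, then boundary control."""
    nC, nx = XC.shape
    k = rng.integers(len(XA), size=nC)        # random active parent per clone
    a = rng.integers(2, size=(nC, nx))        # a ~ U{0, 1}
    b = rng.integers(2, size=(nC, nx))        # b ~ U{0, 1}
    beta = sample_beta(eta, (nC, nx))
    child1 = 0.5 * ((1 - beta) * XC + (1 + beta) * XA[k])
    child2 = 0.5 * ((1 + beta) * XC + (1 - beta) * XA[k])
    out = np.where(a == 1, np.where(b == 0, child1, child2), XC)
    return np.clip(out, xl, xu)               # boundary control step
```

With a large distribution index *η*, *β* concentrates near 1 and the children stay close to their parents, which is the local-search behavior of SBX discussed above.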

#### *3.2.2 Crossover operator of the backtracking search optimization algorithm*

The crossover strategy of BSA is described in [17]. It consists in mixing two input populations **X**<sub>*P*</sub> and **X**<sub>*Q*</sub>, of equal sizes *n<sub>X</sub>* × *n<sub>x</sub>*, to form a new output population **X**<sub>*R*</sub>. The BSA crossover reads:

$$\{\mathbf{X}\_R\}\_{ij} \coloneqq \begin{cases} \{\mathbf{X}\_P\}\_{ij} & \text{if} \quad \mathbf{T}\_{ij} = 0 \\ \{\mathbf{X}\_Q\}\_{ij} & \text{if} \quad \mathbf{T}\_{ij} = 1 \end{cases} \tag{8}$$

for *i* ∈ {1, … , *n<sub>X</sub>*} and *j* ∈ {1, … , *n<sub>x</sub>*}, where **T** is a boolean matrix of size *n<sub>X</sub>* × *n<sub>x</sub>*, formed by following Algorithm 2.

To control the amount of mixing between the two populations **X**<sub>*P*</sub> and **X**<sub>*Q*</sub>, a parameter *η* such that 0 < *η* ≤ *n<sub>x</sub>* must be defined. Moreover, a random permutation of the rows of the **X**<sub>*P*</sub> population is performed before applying relation (8).

**Algorithm 2** Algorithm for the generation of the T matrix used in the BSA crossover

```
1: Initialize T ≔ 0 and a ≔ U(0, 1);
2: if a = 0 then
3:   for i = 1 : nX do
4:     u ≔ Permuting(1 : nx);
5:     b ≔ U(0, η);
6:     for k = 1 : b do
7:       j = u_k;
8:       T_ij = 1;
9:     end for
10:   end for
11: else
12:   for i = 1 : nX do
13:     j ≔ U(0, nx);
14:     T_ij = 1;
15:   end for
16: end if
```
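Algorithm 2 and crossover (8) can be sketched as follows; the draw b ≔ U(0, η) is assumed here to yield an integer in {1, …, η}, and the function names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def bsa_T_matrix(nX, nx, eta):
    """Boolean mixing matrix T of Algorithm 2."""
    T = np.zeros((nX, nx), dtype=bool)
    if rng.integers(2) == 0:                  # a ~ U(0, 1)
        for i in range(nX):
            u = rng.permutation(nx)           # line 4: random variable order
            b = rng.integers(1, eta + 1)      # line 5: number of mixed variables
            T[i, u[:b]] = True                # lines 6-9
    else:
        for i in range(nX):
            T[i, rng.integers(nx)] = True     # lines 12-15: one variable per row
    return T

def bsa_crossover(XP, XQ, eta):
    """BSA crossover, Eq. (8): permute the rows of XP, then take each element
    from XP where T = 0 and from XQ where T = 1."""
    XP = XP[rng.permutation(len(XP))]
    T = bsa_T_matrix(XP.shape[0], XP.shape[1], eta)
    return np.where(T, XQ, XP)
```

Every output row thus mixes at least one and at most *η* variables from **X**<sub>*Q*</sub> into the permuted **X**<sub>*P*</sub>.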
## **4. Recombination propositions for a hybrid algorithm**

To obtain a more efficient immune algorithm, we propose a hybridization method, which consists of replacing the crossover operator used for recombination in NNIA with a new recombination operator inspired by BSA [23].

To build this new algorithm, two ideas are used:

1. The first idea consists of expanding the active population in order to obtain an extended active population whose size equals the clone population size. The simplest way to achieve this is to duplicate the active population;

2. The second idea consists of replacing the active population in the crossover by the clone population, leading to a crossover that uses only the clone population.
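The two ideas can be sketched as follows. This is a loose illustration of the adaptation, with an illustrative elementwise mixing mask standing in for the BSA T matrix; it is not the exact NNIA+X1/X2/X3 operators evaluated in Section 5:

```python
import numpy as np

rng = np.random.default_rng(3)

def expand_active(XA, nC):
    """Idea 1: duplicate the active population until it reaches the clone
    population size nC, truncating any excess rows."""
    reps = int(np.ceil(nC / len(XA)))
    return np.tile(XA, (reps, 1))[:nC]

def hybrid_recombination(XC, XA, eta, use_clones_only=False):
    """BSA-inspired recombination for the immune algorithm: mix the clone
    population either with the expanded active population (idea 1) or with
    itself (idea 2)."""
    XQ = XC if use_clones_only else expand_active(XA, len(XC))
    # illustrative mixing mask in the spirit of the BSA T matrix
    T = rng.random(XC.shape) < (eta / XC.shape[1])
    return np.where(T, XQ, XC)
```

Because both inputs now have the clone population size, the BSA mixing of relation (8) applies row by row without any further bookkeeping.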

## **5. Experiments**

In this section, we study the performance of the hybridization when solving some well-known MO benchmark problems, including five ZDT problems [24] and five DTLZ problems [25].


#### **5.1 Performance metrics**

An approximate Pareto front produced by an MO algorithm must achieve two goals: convergence toward the true Pareto-optimal front, and a diverse, uniformly spread set of solutions along it.

For the benchmark test problems, the true Pareto front is known, which allows performance metrics that use it.

We opted for two performance metrics for assessing algorithm efficiency. To measure both the convergence toward the true set of Pareto-optimal solutions and the spread of the Pareto front set, a normalized version of the inverted generational distance (IGD) metric proposed in [26] is adopted, while a normalized version of the spacing metric introduced in [27] measures the uniformity of the obtained solutions.

#### *5.1.1 Normalized inverted generational distance*

The normalized inverted generational distance (NIGD) is based on a proposition of [26]. It measures the distance between the true Pareto front *F*\*, which is known at *n*\* discrete values and stored in the matrix **F**(**X**\*), and the approximate Pareto-optimal front **F**(**X̂**):

$$\text{NIGD}\left(\mathbf{F}(\mathbf{X}^\*), \mathbf{F}\left(\widehat{\mathbf{X}}\right)\right) = \frac{1}{n^\*} \sqrt{\sum\_{j=1}^{n^\*} \mathbf{c}\_j^2},\tag{10}$$

for:

$$\mathbf{c}\_{j} = \min\_{i \in \{1, \ldots, n\_X\}} \left( \sqrt{\sum\_{k=1}^{m} \left( \overline{\mathbf{F}}\_k\left(\mathbf{X}\_j^\*\right) - \overline{\mathbf{F}}\_k\left(\widehat{\mathbf{X}}\_i\right) \right)^2} \right), \qquad j \in \{1, \ldots, n^\*\}.$$


where the overline denotes a normalized objective function, ranging from 0 to 1 and defined by:

$$\overline{\mathbf{F}}\_{k}(\mathbf{x}) = \frac{\mathbf{F}\_{k}(\mathbf{x}) - \min \left( \mathbf{F}\_{k}(\mathbf{X}^{\*}) \right)}{\max \left( \mathbf{F}\_{k}(\mathbf{X}^{\*}) \right) - \min \left( \mathbf{F}\_{k}(\mathbf{X}^{\*}) \right)}. \tag{11}$$

To obtain small values of this measure, the approximated set **F**(**X̂**) must be very close to the Pareto front while not missing any part of the whole front.

#### *5.1.2 Normalized spacing measure*

The spacing metric introduced in [27] is modified by taking normalized objective functions. This leads to the normalized spacing (NSP) measure, defined by:

$$\text{NSP}(\mathbf{F}(\hat{\mathbf{X}})) = \sqrt{\frac{1}{n\_X - 1} \sum\_{j=1}^{n\_X} \left(\mathbf{d}\_j - \overline{d}\right)^2} \tag{12}$$

for:

$$\mathbf{d}\_{j} = \min\_{\substack{i \in \{1, \ldots, n\_X\} \\ i \neq j}} \left( \sqrt{\sum\_{k=1}^{m} \left( \overline{\mathbf{F}}\_{k} (\hat{\mathbf{X}}\_{i}) - \overline{\mathbf{F}}\_{k} (\hat{\mathbf{X}}\_{j}) \right)^{2}} \right), \qquad j \in \{1, \ldots, n\_X\}.$$

where *d̄* is the mean of the components of **d**.
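Eqs. (12) and the nearest-neighbour distances above translate directly; as for NIGD, the true front is assumed to supply the normalization ranges:

```python
import numpy as np

def nsp(Fhat, Fstar):
    """Normalized spacing, Eq. (12): standard deviation of the
    nearest-neighbour distances on the normalized approximate front."""
    lo, hi = Fstar.min(axis=0), Fstar.max(axis=0)
    F = (Fhat - lo) / (hi - lo)               # normalized objectives, Eq. (11)
    n = len(F)
    d2 = ((F[:, None, :] - F[None, :, :]) ** 2).sum(axis=2)
    np.fill_diagonal(d2, np.inf)              # exclude i == j
    d = np.sqrt(d2.min(axis=1))               # nearest-neighbour distances
    return np.sqrt(((d - d.mean()) ** 2).sum() / (n - 1))
```

A perfectly evenly spaced front gives NSP = 0, and NSP grows as the spacing between neighbouring solutions becomes irregular.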

#### **5.2 Empirical comparison**

In this section, the performance of five NNIA variants is evaluated: NNIA, NNIA-X, NNIA+X1, NNIA+X2, and NNIA+X3.
For NNIA, the parameters proposed in Ref. [9] are used, with a distribution index of 15 for SBX, a distribution index of 20 for polynomial mutation, a mutation probability of 1/*n<sub>x</sub>*, and the number of iterations set to 250. For NNIA+X3, the number of random individuals is chosen equal to *n<sub>A</sub>* and their distribution is uniform.

**Figures 1** and **2** show the statistics box plots for NIGD and NSP obtained from 1000 independent runs performed on each of the ZDT and DTLZ test problems chosen in [9]<sup>1</sup>.

#### **Figure 1.**

*NIGD obtained from 1000 independent runs of problems ZDT1, ZDT2, ZDT3, ZDT4, ZDT6, DTLZ1, DTLZ2, DTLZ3, DTLZ4, and DTLZ7.*

<sup>1</sup> From information given in [25], it is believed that the problem denoted DTLZ6 in [9] is in fact the DTLZ7 problem.


#### **Figure 2.**

*Statistics box plots of NSP obtained from 1000 independent runs of benchmark test problems ZDT1, ZDT2, ZDT3, ZDT4, ZDT6, DTLZ1, DTLZ2, DTLZ3, DTLZ4, and DTLZ7.*

The NIGD statistical results show that the NNIA algorithm is better than NNIA-X for the problems addressed. We also observe the efficiency of the NNIA+X1 and NNIA+X2 algorithms compared with NNIA, with the exception of the difficult DTLZ4 test problem. For the ZDT4 problem, NNIA+X3 performs worse than NNIA+X1 and NNIA+X2, but our proposed algorithm NNIA+X3 remains superior to NNIA for the problems treated. Except for the two problems ZDT4 and DTLZ4, NSP shows the superiority of NNIA over NNIA-X. In all treated cases, NNIA+X1, NNIA+X2, and NNIA+X3 appear to be equal or superior to NNIA. For all these algorithms, however, the DTLZ4 problem seems to be the most difficult, since there are runs for which the Pareto front is approximated by a single point.

**Table 1** shows the percentage of results showing a single point for the Pareto front of the DTLZ4 test problem when a sequence of 1000 runs is performed with


#### **Table 1.**

*Percentage of results exhibiting a single point for the Pareto front of the DTLZ4 test problem when 1000 runs are carried out.*

each algorithm. Overall, we conclude that NNIA+X3 retains better population diversity, and its convergence is faster than that of NNIA for these two- and three-objective test problems.

## **6. Experiments on the 10 bar truss design problem**

In this section, we address the multiobjective sizing optimization of truss-like structures, which is a continuing subject of research in structural design [28–30].

#### **6.1 Problem formulation**

In this study, we consider the 10 bar truss test case sketched in **Figure 3**. Two objective functions have to be minimized, the mass and the displacement, and one objective function has to be maximized, the first flexible natural frequency of the structure.

Denoting by **x** ∈ Ω the vector of the topological and sizing optimization parameters, such that 0 ≤ *x<sub>i</sub>* ≤ 1 for *i* ∈ {1, … , *n*}, where *n* = 10 is the number of elements, the three individual objectives are:

1. The mass *w* of the structure

$$w(\mathbf{x}) = \sum\_{i=1}^{n} \rho A l\_i \mathbf{x}\_i,$$

where *l<sub>i</sub>* is the length of the *i*-th element, *ρ* = 2768 kg/m<sup>3</sup> is the density of the material, and *A* = 0.01419352 m<sup>2</sup> is the element cross-section area.

2. The maximum displacement *u* of the structure

$$u(\mathbf{x}) = \max\left(\mathbf{u}^\* = \arg\min\_{\mathbf{u}\in S} \left(\frac{1}{2}\mathbf{u}^T\mathbf{K}(\mathbf{x})\,\mathbf{u} - \mathbf{u}^T\mathbf{F}\right)\right),$$

where:

• **F** is the vector of loads

**Figure 3.** *Sketch of the 10 bar truss.*


• **K** is the stiffness matrix of the finite element (FE) model, with Young's modulus *E* = 68.95 GPa.

The set *S* refers to the kinematically admissible space, *i.e.*, the one that satisfies the boundary conditions imposed by the supports while carrying all the prescribed loads, where *P* = 448.2 kN.

3. The minimum flexible natural frequency *f*, which is maximized by minimizing its opposite:

$$-f(\mathbf{x}) = -\min\left(\frac{\omega^\*}{2\pi}\right),$$

$$\text{where}: \quad \left\{\omega^{\*2}, \mathbf{u}^\*\right\} = \arg\min\_{\mathbf{u}\in S} \left(\omega^2 = \frac{\mathbf{u}^T \mathbf{K}(\mathbf{x})\,\mathbf{u}}{\mathbf{u}^T \mathbf{M}(\mathbf{x})\,\mathbf{u}}\right), \quad \|\mathbf{u}\| \neq 0.$$

where **M** is the mass matrix of the FE model<sup>2</sup> .
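In practice, the Rayleigh-quotient minimization above amounts to extracting the smallest eigenvalue of the generalized problem **K u** = ω²**M u** on the admissible space. A minimal sketch, assuming already-reduced symmetric positive-definite matrices:

```python
import numpy as np

def first_natural_frequency(K, M):
    """Smallest natural frequency f = omega / (2*pi) from the generalized
    eigenproblem K u = omega^2 M u (K, M: reduced stiffness/mass matrices)."""
    # symmetric reduction A = L^{-1} K L^{-T}, with M = L L^T (Cholesky)
    L = np.linalg.cholesky(M)
    A = np.linalg.solve(L, np.linalg.solve(L, K).T).T
    omega2 = np.linalg.eigvalsh(A)            # eigenvalues = omega^2, ascending
    return np.sqrt(omega2[0]) / (2.0 * np.pi)
```

For a single degree of freedom with *K* = 4π² and *M* = 1, this returns *f* = 1 Hz, which matches ω = 2π rad/s.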

Moreover, this MO problem is subjected to constraints on the mechanical stress *σ<sub>i</sub>* of each element *i*:

$$|\sigma\_i(\mathbf{x})| \le \overline{\sigma} \qquad i \in \{1, \dots, n\}.$$

where *σ̄* = 172.4 MPa is the yield strength.

As designs with local rigid body modes or kinematic modes are not of interest, constraints are added to the MO problem formulation:

$$\frac{|\sigma\_i(\mathbf{x})|}{\overline{\sigma}} > \varepsilon, \qquad i \in \{1, \dots, n\} \ \text{ such that } \ x\_i > 0,$$

where *ε* = 0.001.

Since the optimal Pareto front is unknown for this problem, unnormalized metric indicators must be used to assess the MO algorithm performance. Thus, in practice, we introduce an *a priori* scaling of the three objectives, by defining:

$$f\_1(\mathbf{x}) = \frac{w(\mathbf{x})}{7,000}, \qquad f\_2(\mathbf{x}) = \frac{u(\mathbf{x}) - 0.016}{20}, \qquad f\_3(\mathbf{x}) = \frac{22,500 - \left(2\pi f(\mathbf{x})\right)^2}{5,000}$$

Moreover, in order to handle the constraints of this MO problem, we use the penalty method. This technique consists of replacing the constrained optimization problem by an unconstrained one, introducing new objective functions to be optimized:

$$\phi\_k(\mathbf{x}) = f\_k(\mathbf{x}) + r\rho(\mathbf{x}) \tag{13}$$

where the penalty function chosen here is:

$$\rho(\mathbf{x}) = \sum\_{i=1}^{n} \left( \max \left\{ 0, \frac{|\sigma\_i(\mathbf{x})|}{\overline{\sigma}} - 1 \right\} \right)^2 + \sum\_{i=1}^{n} \left( \max \left\{ 0, \varepsilon - \frac{|\sigma\_i(\mathbf{x})|}{\overline{\sigma}} \right\} \right)^2 \tag{14}$$

and where *r* is a positive penalty parameter, chosen here as *r* = 10<sup>10</sup>.
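Eqs. (13)–(14) translate directly into code; the stress values used below and the helper names are purely illustrative:

```python
import numpy as np

def penalty(sigma, sigma_bar=172.4e6, eps=1e-3):
    """Penalty function of Eq. (14) for the element stresses sigma [Pa]:
    quadratic penalties for overstressed elements and for elements whose
    stress ratio falls below the threshold eps."""
    s = np.abs(np.asarray(sigma, dtype=float)) / sigma_bar
    over = np.maximum(0.0, s - 1.0)           # stress above yield
    under = np.maximum(0.0, eps - s)          # nearly unloaded element
    return np.sum(over ** 2) + np.sum(under ** 2)

def penalized_objectives(f_vals, sigma, r=1e10):
    """Eq. (13): add r * rho(x) to every scaled objective value."""
    return np.asarray(f_vals, dtype=float) + r * penalty(sigma)
```

With the large value *r* = 10¹⁰, any constraint violation dominates the scaled objectives, so infeasible designs are immediately dominated in the Pareto sense.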

<sup>2</sup> To obtain the best numerical efficiency for the FE analysis, the FE disassembly strategy proposed in Ref. [31] is involved.

Finally, the MO problem definition for the 10 bar truss of this work is:

$$\min\_{\mathbf{x}\in\Omega} \left( f\_1(\mathbf{x}) + r\rho(\mathbf{x}), f\_2(\mathbf{x}) + r\rho(\mathbf{x}), f\_3(\mathbf{x}) + r\rho(\mathbf{x}) \right)$$

#### **6.2 Numerical simulations for two objective functions**

In this subsection, we subdivide the 10 bar MO problem of the previous section into three MO subproblems, considering the objective functions two by two: (*w*, *u*), (*w*, *f*), and (*u*, *f*). To solve each of these subproblems, we use the NNIA and NNIA+X3 algorithms, keeping the parameters of subsection 5.2.

After 250 and 750 iterations, we obtain **Figures 4** and **5** (respectively), which show the Pareto fronts of a typical execution when the two algorithms start from the same initial population. In these figures, we observe that the NNIA+X3 algorithm shows better diversity for each subproblem, and that NNIA+X3 gives better convergence for the subproblems (*w*, *f*) and (*u*, *f*). Since each iteration of either algorithm requires *n<sub>C</sub>* = 100 evaluations of the mechanical problem, 25,000 function evaluations are performed after 250 iterations, and 75,000 after 750 iterations.

**Figure 6** shows the evolution of two metric indicators with the number of iterations for one typical run. The indicators chosen here are the spacing and the hyper-volume of the Pareto fronts. The spacing evolution is presented in log-log scale. Each evaluation of the hyper-volume uses the same anti-utopia and utopia points for consistency of the results. Moreover, in order to compare the three MO results on the same graph, a relative hyper-volume is plotted: the hyper-volume obtained divided by its maximum value. These graphs show better diversity and convergence for NNIA+X3 compared with NNIA during the early iterations.

#### **Figure 4.**

*Pareto fronts of the 10 bar truss MO problem two by two: (w, u) (top-left), (w, f) (top-right), (u, f) (bottom), after 250 iterations.*


#### **Figure 5.**

*Pareto fronts of the 10 bar truss MO problem, after 750 iterations, two by two: (w, u) (top-left), (w, f) (top-right), (u, f) (bottom).*

#### **Figure 6.**

*Metric indicators of the 10 bar truss MO problem for the three objective functions taken two by two: spacing (top) and relative hyper-volume (bottom).*

The tendency observed in the previous figure is confirmed by the statistical results of **Figures 7** and **8**. These figures show statistics box plots for the spacing and the hyper-volume (respectively) when 300 runs stopped at 250 iterations are carried out. For the spacing, means and variances are clearly better for NNIA+X3. The hyper-volume statistics are also better for NNIA+X3 when considering the (*w*, *u*) and (*u*, *f*) subproblems, while they are almost identical for the (*w*, *f*) subproblem, although the mean and variance are still slightly better for NNIA+X3. From the hyper-volume results of the (*w*, *f*) subproblem, it is assessed that this subproblem is the most difficult to solve, since a wide spread is observed in the data for both algorithms.

**Figure 7.**

*Statistics box plots of spacing for 300 runs of the two-by-two MO 10 bar subproblems: (w, u) (left), (w, f) (middle), (u, f) (right).*

**Figure 8.**

*Statistics box plots of relative hyper-volume for 300 runs of the two-by-two MO 10 bar subproblems: (w, u) (left), (w, f) (middle), (u, f) (right).*

#### **6.3 Numerical simulation for three objective functions**

**Figure 9** shows different views of the Pareto front obtained for one typical run when solving the three-objective 10 bar truss problem using NNIA+X3 with the following parameter values: size of the active population 30, clonal scale 150, and 750 iterations. In this case, the size of the dominant population is not limited and all Pareto points found are kept. **Figure 10** shows the evolution of the

#### **Figure 9.**

*Four different views of the Pareto front obtained when solving the three objective functions of the 10 bar truss problem; the colorized surface of the bottom-right subfigure is added for better visualization, and the color corresponds to the frequency objective f.*


**Figure 10.**

*Evolution of results for a typical run of the MO 10 bar problem: Number of points for the Pareto front (left), spacing (middle), and relative hyper-volume (right); Blue line with cross markers: NNIA; Red line with squared markers: NNIA+X3.*

**Figure 11.**

*Statistics box plots for 300 runs of the three objectives 10 bar problem with NNIA and NNIA+X3 with random initial population: Number of Pareto front points (left), spacing (middle), and relative hyper-volume (right).*

number of points in the dominant population for the Pareto front given in **Figure 9**. It ends at 2216 Pareto points for this run.

**Figure 11** shows box plot statistics when 300 runs are carried out with NNIA and NNIA+X3. It is observed that the number of points of the Pareto front is higher for NNIA+X3, with a better spacing, but the hyper-volume is better for NNIA. Detailed analysis of the results revealed that the poor hyper-volume results are due to slow convergence toward an extreme Pareto front point: the individual optimum of the frequency objective. For this problem, the individual minima found for the frequency objective are most of the time better for NNIA than for NNIA+X3. However, it is also found that the individual minima of the three objectives are rarely found in the Pareto front by either algorithm.

For better results, the idea is to inject the three individual minima, found by mono-objective optimizations, into the random initial population of both NNIA and NNIA+X3. This simple modification greatly improves the performance results. **Figure 12** shows statistics box plots when 300 runs of the MO problem are carried out with the three individual optima included in the initial population. In this situation, and for each of the 300 runs performed, NNIA+X3 appears superior to NNIA in all performance aspects, including the computed hyper-volume.

#### **Figure 12.**

*Statistics box plots for 300 runs of the three objectives 10 bar problem with NNIA and NNIA+X3 when individual optima are included in the initial population: Number of Pareto front points (left), spacing (middle), and relative hyper-volume (right).*
