**1. Introduction**

Optimization problems are common in engineering; the problem is optimized with respect to one function, usually called the cost function. The cost function depends on several parameters that model the optimization problem. Several methods have been proposed in the literature to optimize functions, and the method is selected according to the characteristics of the cost function and its parameters.

Deterministic methods usually require an initial point, called the seed. Using the gradient of the cost function, the optimization method starts at the seed and converges to the optimal parameter value. The parameters are usually continuous [1]. However, many engineering problems cannot be solved by this approach.

Some cost functions have several minima, and the convergence strongly depends on the seed (initial point). To overcome this issue, several metaheuristics were proposed, such as Genetic Algorithms (GA) [2, 3], Particle Swarm Optimization (PSO) [4], and Simulated Annealing (SA) [5]. Optimization methods based on metaheuristics do not need gradient information. For some problems, it is very hard to determine the gradient; this is particularly true for problems with a discrete cost function. Existing metaheuristics are usually focused on a specific type of parameter. For example, PSO can be applied to engineering problems with continuous parameters, while GA can be applied to engineering problems with integer or fixed-precision parameters.

SA was first proposed to solve combinatorial problems [5]. Later, Corana et al. [6] extended SA to incorporate continuous parameters. It was shown that the proposal by Corana et al. [6] converged to the global optimum only in specific problems. Ingber [7] improved SA to overcome this issue; however, he used the gradient of the cost function in some calculations. Martins and Tsuzuki [8] proposed the SA with crystallization heuristic, which does not use any gradient information.

This chapter shows the versatility of the SA with crystallization heuristic. Initially, the SA with crystallization heuristic is explained in Section 2. It was used to solve different problems related to Cutting and Packing; in particular, the case with a discrete cost function and cyclic continuous parameters [8–10] is explained in Section 3. A third application is Topological Optimization (TO), which, like the EIT problem, has a large number of parameters; this application is explained in Section 4. The curve interpolation problem, explained in Section 5, has two different types of parameters: continuous and integer. It is worth noting that the SA with crystallization heuristic has also been applied with success to interval-based objective function evaluation [11–13]. The conclusions are in Section 6.

## **2. Simulated annealing with crystallization heuristic**

The SA is an optimization algorithm that uses a random local search and is able to find the global optimum solution. It was proposed by [5, 14], and it was inspired by the metal annealing process, which starts at a high temperature and reduces the temperature in small steps until reaching the minimum temperature. In each step, the new solution is accepted if it improves the current solution. Otherwise, it is accepted only if a probability factor *P*(*T*) is larger than a random number [10, 14, 15]. *P*(*T*) can be determined by

$$P(T) = \exp\left(-\frac{\Delta E}{T}\right)\tag{1}$$

where *T* is the current temperature, and Δ*E* is the energy state variation, i.e., the difference between the objective function *F*(*x*) evaluated at the new candidate *a* and at the current solution *b*, as Δ*E* = *F*(*a*) − *F*(*b*).
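As a concrete illustration, the acceptance test based on Eq. (1) can be sketched in Python (the function name and argument names are ours, not from the chapter):

```python
import math
import random

def accept(delta_e, temperature):
    """Metropolis acceptance test built on Eq. (1).

    An improving candidate (delta_e < 0) is always accepted; a worsening
    one is accepted only when P(T) = exp(-delta_e / T) exceeds a uniform
    random number.
    """
    if delta_e < 0:
        return True
    return random.random() < math.exp(-delta_e / temperature)
```

At high temperatures exp(−Δ*E*/*T*) is close to 1, so almost any candidate is accepted; as *T* approaches zero the rule degenerates into a greedy descent.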

*Versatility of Simulated Annealing with Crystallization Heuristic: Its Application… DOI: http://dx.doi.org/10.5772/intechopen.98562*

Algorithm 1 describes a possible implementation of a conventional SA. It initializes with a random solution and an initial temperature, which is an important parameter. There are two loops. The external loop controls the temperature: the temperature decreases according to a geometric cooling schedule with factor *α*, and it stops when the frozen state is reached. The cooling schedule, in this case the value of *α*, must be carefully chosen, as the convergence to the optimal distribution strongly depends on it. In the majority of the applications explained here, we used *α* = 0.98. The internal loop performs the random local search until the thermal equilibrium is reached. The random search consists of modifying a single parameter so that a new candidate *x*<sup>∗</sup> is obtained. The new objective function value is evaluated and compared with the cost of the current solution. Then, the candidate is either accepted or rejected according to (1). The convergence conditions are usually controlled by algorithm parameters, which influence the quality of the final result as well as the speed of the algorithm.
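A minimal Python sketch of such a conventional SA, for a single continuous parameter, could look as follows (the parameter names and default values are illustrative, not taken from the chapter):

```python
import math
import random

def simulated_annealing(f, x0, t0=10.0, alpha=0.98, t_min=1e-3,
                        steps_per_temp=100, step=0.5):
    """Conventional SA in the spirit of Algorithm 1, for one
    continuous parameter.  External loop: geometric cooling
    T <- alpha * T until the frozen state (T < t_min).
    Internal loop: random local search at a fixed temperature."""
    x, t = x0, t0
    while t > t_min:                        # external loop: cooling schedule
        for _ in range(steps_per_temp):     # internal loop: local search
            candidate = x + random.uniform(-step, step)
            delta_e = f(candidate) - f(x)
            # accept improvements always, worsening moves per Eq. (1)
            if delta_e < 0 or random.random() < math.exp(-delta_e / t):
                x = candidate
        t *= alpha
    return x

random.seed(42)
best = simulated_annealing(lambda v: (v - 3.0) ** 2, x0=0.0)
```

With the quadratic test function above, the final temperatures are so low that only improving moves are accepted, and `best` settles near the minimum at 3.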


The random search in the SA algorithm can use different strategies to improve the generation of a new solution. One of them is the crystallization heuristic proposed by [9, 16]. The SA has two phases during the search: exploration and refinement. **Figure 2** shows the two phases and their connection with the crystallization factor. The exploration of the domain space usually happens at higher temperatures, while the refinement of the solution happens at lower temperatures. There is no clear interface between the two phases.

For problems with combinatorial and integer parameters, the generation of the new candidate can be easily performed. In the case of combinatorial parameters, the algorithm exchanges the positions of two parameters. In the case of integer parameters, the algorithm increments, decrements, or keeps the current value. There are other possibilities: considering that integer parameters have lower and upper bounds, a new value can also be generated by a random selection in the range.

**Figure 1.**

*The crystallization heuristic represented by SA controlling each parameter crystallization factor.*

**Figure 2.**

*During the optimization process, the SA has two phases: exploration and refinement. In the exploration phase, the parameters have low crystallization and can perform larger jumps. The inverse happens in the refinement phase, where the parameters have high crystallization.*
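The candidate-generation moves for combinatorial and integer parameters described above can be sketched as follows (function names are ours):

```python
import random

def combinatorial_move(perm):
    """Exchange the positions of two parameters in a permutation."""
    a, b = random.sample(range(len(perm)), 2)
    cand = list(perm)
    cand[a], cand[b] = cand[b], cand[a]
    return cand

def integer_move(value, lower, upper):
    """Increment, decrement, or keep an integer parameter, clamped
    to its lower and upper bounds."""
    cand = value + random.choice((-1, 0, 1))
    return min(max(cand, lower), upper)

def integer_resample(lower, upper):
    """Alternative move: draw a fresh value uniformly from the range."""
    return random.randint(lower, upper)
```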

The control of continuous parameters is more challenging. The crystallization heuristic proposed by [9, 16] considers a fixed maximum step, and the SA controls the probability density. **Figure 2** shows that a wider probability density allows larger jumps with higher probability, while a thinner probability density allows smaller jumps with higher probability. However, a larger jump is still possible, with very small probability. This feature allows the SA to escape from local minima.

Each continuous parameter has a crystallization factor, represented by *c<sub>j</sub>*. Considering the crystallization heuristic, the new candidate is determined by summing *c<sub>j</sub>* uniform random numbers and dividing the sum by *c<sub>j</sub>*; this procedure yields a Bates distribution. It is represented by

$$
x^* = x + \frac{1}{c_j} \sum_{k=1}^{c_j} \mathrm{random}\left(-\tfrac{1}{2}, \tfrac{1}{2}\right) \cdot \Delta r_j \cdot e_j \tag{2}
$$

where *x* is the current solution and *x*<sup>∗</sup> is the new candidate, both represented by vectors. Δ*r<sub>j</sub>* is the fixed step size associated with continuous parameter *j*, and *e<sub>j</sub>* represents the selected parameter: it is a vector with all elements equal to zero except the one at the position associated with the *j*-th parameter. A common value assigned to Δ*r<sub>j</sub>* depends on the search interval, as Δ*r<sub>j</sub>* = (max<sub>*j*</sub> − min<sub>*j*</sub>)/4. This value provides enough exploration for the algorithm.
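Equation (2) can be sketched in Python as follows; the helper names are ours, and the step size follows the quarter-of-the-interval rule mentioned above:

```python
import random

def default_step(lo, hi):
    """Fixed step size: a quarter of the search interval."""
    return (hi - lo) / 4.0

def bates_candidate(x, j, c, delta_r):
    """New candidate per Eq. (2): the mean of c[j] uniform draws on
    (-1/2, 1/2), scaled by delta_r[j], is added to parameter j only
    (which is the role of the unit vector e_j)."""
    mean = sum(random.uniform(-0.5, 0.5) for _ in range(c[j])) / c[j]
    cand = list(x)
    cand[j] += mean * delta_r[j]
    return cand
```

Because the mean of uniform draws concentrates around zero as *c<sub>j</sub>* grows, a higher crystallization factor produces a thinner density and smaller typical jumps, while never exceeding the fixed maximum step.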

It remains to explain the procedure by which the SA controls the crystallization factor of each parameter. The SA modifies only one parameter at a time, and according to the decision of accepting or rejecting the new candidate, an action is performed. The procedure is represented in **Figure 1**. If the new candidate is rejected, it is assumed that the SA is performing exploration and, to increase the number of accepted solutions, the crystallization factor associated with this parameter is increased. The increase in the crystallization factor reduces the probability of larger jumps for this parameter. On the other hand, if the new candidate is accepted, it is assumed that the SA is performing refinement and, to increase exploration, the crystallization factor associated with this parameter is reduced. The reduction of the crystallization factor enlarges the probability of larger jumps for this parameter.
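This feedback rule can be written compactly (a sketch; the minimum factor of 1 matches the initialization in Algorithm 2):

```python
def update_crystallization(c, j, accepted):
    """Adjust the crystallization factor of parameter j after the
    accept/reject decision: rejection increases c[j] (thinner density,
    smaller jumps); acceptance decreases it, never below 1."""
    if accepted:
        if c[j] > 1:
            c[j] -= 1   # positive feedback: allow larger jumps again
    else:
        c[j] += 1       # negative feedback: favor refinement
    return c
```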


**Algorithm 2**: SA with Crystallization Heuristic.

```
1:  i ← 0; c ← (1, 1, …, 1)
2:  x ← <random initial solution>
3:  T_0 ← <initial temperature>
4:  Δr ← <range of the parameters>
5:  while <global condition not satisfied> do
6:      while <local condition not satisfied> do
7:          j ← <select parameter to modify>
8:          x* ← x + (1/c_j) Σ_{k=1}^{c_j} random(−1/2, 1/2) · Δr_j · e_j
9:          ΔE ← F(x*) − F(x)
10:         if ΔE < 0 then
11:             x ← x*
12:             if c_j > 1 then
13:                 c_j ← c_j − 1        ▷ Positive feedback
14:             end if
15:         else
16:             if random(0, 1) < e^(−ΔE/T_i) then
17:                 x ← x*
18:                 if c_j > 1 then
19:                     c_j ← c_j − 1    ▷ Positive feedback
20:                 end if
21:             else
22:                 c_j ← c_j + 1        ▷ Negative feedback
23:             end if
24:         end if
25:     end while
26:     i ← i + 1
27:     T_i ← T_{i−1} · α
28: end while
```

Algorithm 2 describes the implementation of the SA with crystallization heuristic. In this algorithm, the crystallization heuristic adjusts the modification to increase the acceptance of new solutions. The crystallization factor *c<sub>j</sub>* is initialized with 1 for all continuous parameters, and the fixed step size Δ*r<sub>j</sub>* is defined as 25% of the search range. For example, if the search happens between −100 and +100, then Δ*r<sub>j</sub>* = 50.
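The whole of Algorithm 2 can be sketched in Python as below. This is a minimal illustration, not the authors' implementation: the stopping conditions (a frozen state once the temperature drops below `t_min`, a fixed number of trials per temperature) and the default constants are our assumptions.

```python
import math
import random

def sa_crystallization(f, lower, upper, t0=10.0, alpha=0.98,
                       t_min=1e-3, steps_per_temp=50):
    """Sketch of Algorithm 2 for a vector of continuous parameters."""
    n = len(lower)
    c = [1] * n                                            # c <- (1, ..., 1)
    x = [random.uniform(lower[j], upper[j]) for j in range(n)]
    delta_r = [(upper[j] - lower[j]) / 4.0 for j in range(n)]  # 25% of range
    t = t0
    while t > t_min:                                 # global condition
        for _ in range(steps_per_temp):              # local condition
            j = random.randrange(n)                  # parameter to modify
            step = sum(random.uniform(-0.5, 0.5)
                       for _ in range(c[j])) / c[j]  # Bates distribution
            cand = list(x)
            cand[j] += step * delta_r[j]
            delta_e = f(cand) - f(x)
            if delta_e < 0 or random.random() < math.exp(-delta_e / t):
                x = cand
                if c[j] > 1:
                    c[j] -= 1                        # positive feedback
            else:
                c[j] += 1                            # negative feedback
        t *= alpha
    return x

random.seed(0)
sol = sa_crystallization(lambda v: sum(vi ** 2 for vi in v),
                         lower=[-100.0, -100.0], upper=[100.0, 100.0])
```

On this quadratic test function the crystallization factors grow as candidates start being rejected, shrinking the effective step and refining the solution toward the origin.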
