**1.1. Simulated annealing**

Simulated annealing (SA) ([16], [14]) is a generic probabilistic meta-algorithm for the global optimization problem. SA is a robust general optimization method based on the work of [17]. It simulates the annealing of a metal, in which the metal is heated up to a temperature near its melting point and then slowly cooled to allow the particles to move towards an optimal energy state. This results in a more uniform crystalline structure, and so the process allows some control over the microstructure. SA has been demonstrated to be robust and capable of dealing with noisy and incomplete real-world data, which makes it suitable for engineering applications.

SA is a variation of the hill-climbing algorithm. Both start from a randomly selected point within the search space. Unlike in hill-climbing, if the fitness of a new candidate solution is less than the fitness of the current solution, the new candidate is not automatically rejected. Instead it becomes the current solution with a certain transition probability *p*(*T*). This transition probability depends on the change in fitness Δ*E* and the temperature *T*. Here temperature is an abstract control parameter for the algorithm rather than a real physical measure. The algorithm starts with a high temperature, which is subsequently reduced slowly, usually in steps. At each step, the temperature must be held constant for an appropriate period of time (i.e. number of iterations) in order to allow the algorithm to settle into a thermal equilibrium, i.e. a balanced state. If this time is too short, the algorithm is likely to converge to a local minimum. The combination of temperature steps and cooling times is known as the annealing schedule, which is usually selected empirically.

By analogy with metallurgical processes, each step of the SA algorithm replaces the current solution by a randomly generated solution from its neighborhood, chosen with a probability that depends on the difference between the corresponding function values and on a global parameter, the so-called temperature *T*. The temperature decreases during the process. The current solution changes almost randomly when *T* is large, but increasingly downhill as *T* goes to zero. The allowance for uphill moves saves the method from becoming stuck at a local minimum. Simulated annealing is a stochastic algorithm depending on the following parameters:

$$SA = (M, x_0, N, f, T_0, T_f, \alpha, n_T),\tag{1}$$

where the meaning of the parameters is:

• **M**: space of possible solutions
• *x*0: initial solution, randomly selected
• *N*(*x*, *σ*): normal distribution
• *f*: the cost function
• *T*0: initial temperature
• *Tf*: stopping temperature (temperature of crystallization)
• *α*: temperature reduction *α* : *T* → *T*′, *T*′ < *T*, usually *T*′ = *α* × *T*; the parameter *α* is usually in the range 0.8 - 0.99 (illustrated in the sketch below)
• *nT*: number of iterations of the Metropolis algorithm
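
To make the roles of Δ*E*, *T*, and *α* concrete, the following Python sketch (illustrative only; the value Δ*E* = 1, the starting temperature, and the name `transition_probability` are invented for the example, not taken from the text) prints how the transition probability *p*(*T*) for accepting a worsening move decays as the geometric schedule *T*′ = *α* × *T* cools the system:

```
import math

def transition_probability(delta_e: float, T: float) -> float:
    """p(T): probability of accepting a candidate whose fitness is worse by delta_e."""
    if delta_e < 0:
        return 1.0                    # improving moves are always accepted
    return math.exp(-delta_e / T)     # worsening moves: Boltzmann-type probability

# Geometric cooling T' = alpha * T, with alpha in the usual 0.8-0.99 range.
T, alpha, delta_e = 10.0, 0.9, 1.0
for step in range(6):
    p = transition_probability(delta_e, T)
    print(f"step {step}: T = {T:6.3f}, p(accept worse move) = {p:.3f}")
    T = alpha * T
```

At a high *T* the worsening move is accepted nearly every time (almost random search); as *T* approaches zero the probability vanishes and the algorithm behaves like plain hill-climbing.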


In the real world, any object consists of particles. Its physical state can be described by a vector **x** = (*x*1, *x*2, ..., *xn*), for example describing particle positions. This state is related to an energy *y* = *f*(**x**). If such a system is kept at a constant temperature *T* long enough, the probabilities of its states are given by the Boltzmann distribution. The probability that the system is in state **x** is then given by

$$\frac{e^{-f(\mathbf{x})/T}}{Q(T)}\tag{2}$$

with


$$Q(T) = \sum_{\mathbf{x}} e^{-f(\mathbf{x})/T} \tag{3}$$

where the summation runs over all states **x**. For a sufficiently small *T*, the probability that the system will be in the state *x*min with minimal energy *f*(**x**min) is almost 1.
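
As a quick numerical check of (2) and (3), the following sketch (the four energy values are hypothetical, chosen only for illustration) computes the Boltzmann probabilities of a small finite state set at several temperatures; at the lowest temperature essentially all of the probability mass sits on the minimum-energy state:

```
import math

def boltzmann_probabilities(energies, T):
    """p(x) = exp(-f(x)/T) / Q(T), with Q(T) the partition function from (3)."""
    weights = [math.exp(-e / T) for e in energies]
    Q = sum(weights)                  # Q(T): sum of exp(-f(x)/T) over all states
    return [w / Q for w in weights]

energies = [0.0, 0.5, 1.0, 2.0]       # hypothetical energies f(x) of four states
for T in (10.0, 1.0, 0.05):
    probs = boltzmann_probabilities(energies, T)
    print(f"T = {T:5.2f}:", " ".join(f"{p:.3f}" for p in probs))
```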

In the 1950s, the simulation of annealing by means of the Monte Carlo method was suggested, using a new decision function (4):
$$P(\mathbf{x} \rightarrow \mathbf{x}_0) = \begin{cases} 1, & \text{for } f(\mathbf{x}) < f(\mathbf{x}_0) \\ e^{-(f(\mathbf{x}) - f(\mathbf{x}_0))/T}, & \text{for } f(\mathbf{x}) \ge f(\mathbf{x}_0) \end{cases} \tag{4}$$

This decision function determines whether a new state **x** (obtained, for example, when one particle changes its position) is accepted or not. If **x** has lower energy than the old state, the old state is replaced by the new one. Otherwise, **x** is accepted with probability 0 < *P*(**x** → **x**0) < 1: if *r* is a random number from [0, 1], the new state is accepted only if *r* < *P*(**x** → **x**0). In (4), *T* has an important influence on the probability *P*(**x** → **x**0) when *f*(**x**) ≥ *f*(**x**0): for a large *T* almost any new state (solution) is accepted, while for a low *T* states with higher energy are only rarely accepted. If this algorithm (the Metropolis algorithm) is repeated at one temperature for a sufficient number of iterations, the observed distribution of generated states approaches the Boltzmann distribution. This makes it possible to execute SA on a computer. SA then repeats the Metropolis algorithm for decreasing temperatures, using the final state reached at temperature *Tn* as the initial state for the next run at temperature *Tm* = *Tn* − *ε*, where *ε* is an arbitrarily small number. The pseudocode for SA is:

```
Randomly select initial solution x0 from all possible solutions M;
x* := x0;
Set initial temperature T0 > 0; T := T0;
Select decrement function α(T) and final temperature Tf;
repeat
    for i := 1 to nT do
    begin
        randomly select x from the set of all neighbors N(x0);
        Δf := f(x) − f(x0);
        if Δf < 0 then
        begin
            x0 := x;              { a move to a better solution is always accepted }
            if f(x) < f(x*) then
                x* := x;          { update the best solution }
        end
        else
        begin
            randomly select r from the uniform distribution on (0, 1);
            if r < e^(−Δf / T) then
                x0 := x;          { move to a worse solution; otherwise the current
                                    solution remains unchanged }
        end
    end;
    T := α(T);
until T < Tf;
{ x* is an approximation of the optimal solution }
```
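
A minimal runnable Python translation of this pseudocode, assuming a one-dimensional real-valued search space in which neighbors are sampled from the normal distribution *N*(*x*, *σ*) of (1); the cost function *f*(*x*) = *x*² and all parameter values are illustrative choices, not prescribed by the text:

```
import math
import random

def simulated_annealing(f, x0, neighbor, T0, Tf, alpha, n_T):
    """Simulated annealing following the pseudocode above.

    f        -- cost function
    x0       -- initial solution
    neighbor -- returns a random neighbor of the given solution
    T0, Tf   -- initial and stopping (crystallization) temperature
    alpha    -- geometric cooling factor, usually 0.8-0.99
    n_T      -- Metropolis iterations per temperature step
    """
    x_curr, x_best = x0, x0
    T = T0
    while T >= Tf:
        for _ in range(n_T):
            x = neighbor(x_curr)
            delta_f = f(x) - f(x_curr)
            if delta_f < 0:
                x_curr = x                   # better solution: always accepted
                if f(x) < f(x_best):
                    x_best = x               # update the best solution found
            elif random.random() < math.exp(-delta_f / T):
                x_curr = x                   # worse solution: accepted with p(T)
        T = alpha * T                        # cool down: T' = alpha * T
    return x_best                            # approximation of the optimal solution

# Example: minimize f(x) = x^2 with a Gaussian neighborhood N(x, sigma = 1).
best = simulated_annealing(
    f=lambda x: x * x,
    x0=random.uniform(-10.0, 10.0),
    neighbor=lambda x: random.gauss(x, 1.0),
    T0=10.0, Tf=1e-3, alpha=0.9, n_T=100,
)
print(f"approximate minimizer: {best:.4f}")
```

Keeping *x*∗ separate from the current solution mirrors the pseudocode: the walk may accept uphill moves, but the best solution seen so far is never lost.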