**2. Optimization**

Optimization is of great importance for engineers, scientists and managers, and it is an important part of the design process in all disciplines. The optimal design of a machine, the minimum path for a mobile robot and the optimal placement of a foundation are all optimization problems.

A constrained optimization problem has three main elements: design variables, constraints and one or more objective functions. Design variables are the independent variables of the objective function and can take continuous or discrete values. The ranges of these variables are given for the problem. Constraints are functions of the design variables and limit the search space. The objective function is the main function that depends on the design variables. If there is more than one objective function, the problem is called a multi-objective optimization problem.


Solving an optimization problem means finding the values of the design variables which minimize the objective function within the given constraints. If the objective is to be maximized, the problem is converted to a minimization problem by multiplying the objective function by -1, as seen in Figure 1. The mathematical model of an optimization problem is given by Equation 1.

$$\text{Minimize } f_{obj}(x_1, x_2, \ldots, x_n) \text{ subject to } \begin{cases} g_i(x_1, x_2, \ldots, x_n, a) \le 0, & i = 1, 2, \ldots, l \\ h_j(x_1, x_2, \ldots, x_n, b) = 0, & j = 1, 2, \ldots, m \\ x_k^l \le x_k \le x_k^u, & k = 1, 2, \ldots, n \end{cases} \tag{1}$$

where **x1, x2, …, xn** are the design variables, **n** is the number of design variables, **l** is the number of inequality constraints, **m** is the number of equality constraints, and **a** and **b** are constant values. **xkl** and **xku** are the lower and upper bounds of the design variables, respectively.

Fig. 1. An objective function conversion to minimization
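
As a concrete illustration of the notation in Equation 1, the following minimal Python sketch encodes a small, hypothetical constrained problem with two design variables, one inequality constraint and one equality constraint, checks the feasibility of a candidate point, and shows the conversion of a maximization objective into a minimization one by multiplying by -1. The particular functions, bounds and tolerance are illustrative assumptions and are not taken from this chapter.

```python
# Hypothetical instance of Equation (1): n = 2 design variables,
# l = 1 inequality constraint, m = 1 equality constraint.
# All functions, bounds and values below are illustrative only.

def f_obj(x):                             # objective function to be minimized
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def g1(x):                                # inequality constraint, required: g1(x) <= 0
    return x[0] + x[1] - 3.0

def h1(x):                                # equality constraint, required: h1(x) = 0
    return x[0] - 2.0 * x[1]

bounds = [(-5.0, 5.0), (-5.0, 5.0)]       # x_k^l <= x_k <= x_k^u for each variable

def is_feasible(x, tol=1e-6):
    """A candidate is feasible if the bounds and all constraints are satisfied."""
    in_bounds = all(lo <= xk <= hi for xk, (lo, hi) in zip(x, bounds))
    return in_bounds and g1(x) <= tol and abs(h1(x)) <= tol

# Conversion of a maximization problem (Figure 1): maximizing some objective
# f_max is equivalent to minimizing its negative.
def f_max(x):
    return -f_obj(x)                      # a hypothetical objective to maximize

def f_max_converted(x):
    return -f_max(x)                      # minimized in place of maximizing f_max

x = [2.0, 1.0]                            # satisfies both constraints and the bounds
print(f_obj(x), g1(x), h1(x), is_feasible(x))
```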

There are many applications in decision science, engineering and operations research which can be formulated as constrained continuous optimization problems. These applications include path planning for mobile robots, trajectory planning for robot manipulators, engineering design and computer-aided design (CAD). Optimal or good solutions to these applications are very important for system performance, such as low-cost implementation and maintenance, fast execution and robust operation (Wang, 2001).

There are different methods for the solution of optimization problems; exact methods and heuristics are the two main ones. Exact methods find exact solutions to a given problem, but when the problem size increases their solution time becomes unacceptable, because the solution time grows exponentially with the problem size. So, using heuristic methods for the solution of optimization problems is more practical (Rashedi et al., 2009). Consequently, in the last decades there has been a great deal of interest in the application of heuristic search algorithms to such problems.

The main difficulty encountered in the solution of an optimization problem is local minimums. If there are many local minimums, then both exact methods and heuristics can be trapped in them. Since most heuristic algorithms include a strategy to avoid local minimums, they have become quite popular recently. The local minimums and the global minimum of an objective function with one design variable are shown in Figure 2.

Fig. 2. An objective function with local minimums
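
To make the trapping problem concrete, the minimal sketch below defines a one-variable objective with several local minimums, in the spirit of Figure 2, and compares a plain greedy descent, which stops in whichever valley it starts from, with a simple random-restart strategy that repeats the descent from many starting points and keeps the best result. The test function, step sizes and number of restarts are illustrative assumptions; random restarts are only one simple escape strategy and are not a specific algorithm from this chapter.

```python
import math
import random

def f(x):
    # A one-variable objective with several local minimums (illustrative choice).
    return x * x + 10.0 * math.sin(x)

def greedy_descent(x, step=0.5, shrink=0.5, tol=1e-4):
    """Move left or right while it improves f, otherwise shrink the step.
    This simple local search stops in whichever local minimum basin it starts in."""
    while step > tol:
        if f(x - step) < f(x):
            x -= step
        elif f(x + step) < f(x):
            x += step
        else:
            step *= shrink
    return x

random.seed(0)

# A single descent from an unlucky starting point is trapped in a local minimum.
x_single = greedy_descent(6.0)

# A simple heuristic remedy: restart the descent from random points in the
# bounds [-10, 10] and keep the best solution found.
starts = [random.uniform(-10.0, 10.0) for _ in range(20)]
x_multi = min((greedy_descent(x0) for x0 in starts), key=f)

print(f"single start: x = {x_single:.3f}, f = {f(x_single):.3f}")
print(f"multi-start : x = {x_multi:.3f}, f = {f(x_multi):.3f}")
```

Full heuristic algorithms use more sophisticated mechanisms than blind restarts, but the underlying goal of escaping local minimums is the same.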

**3. Heuristic algorithms** 

Optimization studies drew attention in the 1960's. Many algorithms based on a more sophisticated mathematical background were developed; these algorithms were called exact methods. However, exact methods produced exact solutions only for a limited scope of applications. As a result, attention turned to heuristic algorithms (Fisher et al., 1998). Heuristic algorithms try to find acceptable solutions to problems using some heuristic knowledge, and most of them simulate real life. They use not only pure mathematics but also algorithms built on basic formulations. An important property of heuristic algorithms is that they are designed for unconstrained optimization problems; nevertheless, they can also be adapted to constrained optimization problems. The algorithms are designed to find the minimum value of the objective function within the bounds of the constraints. If a solution does not satisfy the constraints, it is not acceptable, even if its objective function value is minimum. So, if there are constraints in a minimization problem, a penalty function is added to the objective function; the total function is called the fitness function. Namely, if a solution is in the feasible region, there is no penalty and the penalty function is equal to zero. If the solution is not in the feasible region, the fitness function is penalized by the penalty function.
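
The following minimal sketch shows one common way to build the fitness function described above: a penalty term that is zero for feasible solutions and grows with the amount of constraint violation is added to the objective function. The quadratic penalty form and the weight value are illustrative assumptions; the chapter does not prescribe a particular penalty function here.

```python
# Fitness = objective + penalty. The penalty is zero inside the feasible region
# and positive outside it. The quadratic form and the weight are illustrative.

def fitness(x, f_obj, ineq_constraints, eq_constraints, weight=1.0e3):
    penalty = 0.0
    for g in ineq_constraints:              # required: g(x) <= 0
        penalty += max(0.0, g(x)) ** 2      # only violations (g(x) > 0) are penalized
    for h in eq_constraints:                # required: h(x) = 0
        penalty += h(x) ** 2                # any deviation from zero is penalized
    return f_obj(x) + weight * penalty

# Reusing the hypothetical problem sketched after Figure 1:
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
g1 = lambda x: x[0] + x[1] - 3.0            # inequality constraint
h1 = lambda x: x[0] - 2.0 * x[1]            # equality constraint

print(fitness([2.0, 1.0], f, [g1], [h1]))   # feasible point: fitness equals the objective
print(fitness([4.0, 1.0], f, [g1], [h1]))   # infeasible point: heavily penalized
```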

Heuristic algorithms do not guarantee finding the optimal solution; they try to find acceptable solutions near the optimum in a reasonable time. These algorithms have been studied for both discrete and continuous optimization problems since the 1970's. Researchers have tried to develop adaptive and hybrid heuristic algorithms, and with this aim tens of algorithms were developed in the last decade. Heuristic algorithms can be classified as single solution
