**3. Artificial Neural Network**

An Artificial Neural Network (ANN) is a replica of the biological neural network, and its working principle is quite similar to that of its biological counterpart [11]. The outer nodes collect the input signals; the other nodes are interconnected and finally produce the responses to those inputs. In a neuron there are synapses, which multiply each input by a weight value; only if the resulting value exceeds the threshold value is the response transferred to the next neuron. The interconnected inner layers of neurons are known as hidden layers. Eq. (3) shows the input calculation for each hidden-layer neuron.

$$I\_{i} = \sum\_{j=1}^{n} W\_{ij} X\_{j} \tag{3}$$

The output responses are defined by sigmoid function as shown in Eq. (4).

$$O\_{i} = f(I\_{i}) = \frac{1}{1 + e^{-I\_{i}}} \tag{4}$$
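Eqs. (3) and (4) together define the forward pass of a single neuron. A minimal Python sketch of that calculation (illustrative only; the chapter's analysis uses Matlab, and the weights here are made up):

```python
import math

def neuron_output(weights, inputs):
    """Forward pass of a single artificial neuron: the weighted sum of
    Eq. (3) followed by the sigmoid activation of Eq. (4)."""
    i = sum(w * x for w, x in zip(weights, inputs))  # I_i = sum of W_ij * X_j
    return 1.0 / (1.0 + math.exp(-i))                # O_i = 1 / (1 + e^(-I_i))

# With a zero net input the sigmoid returns exactly 0.5.
print(neuron_output([0.5, -0.5], [1.0, 1.0]))  # -> 0.5
```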

The working principle of an artificial neuron is shown in **Figure 1**. There are generally three types of architecture in the case of neural networks:


• Self-organizing-neural networks

**Figure 1.** *Schematic diagram of an artificial neuron.*

**Figure 2.** *The architecture of general ANN.*

**Figure 2** displays the general structure of ANN.
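The layered structure of a general ANN — inputs feeding a hidden layer whose outputs feed an output neuron — can be sketched as follows (a toy Python illustration with assumed weights and sizes, not the network used in this study):

```python
import math

def sigmoid(x):
    """Sigmoid activation, Eq. (4)."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, hidden_w, output_w):
    """Forward pass through one hidden layer and a single output neuron.
    hidden_w is one weight row per hidden neuron; output_w weights the
    hidden activations into the final response."""
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs))) for ws in hidden_w]
    return sigmoid(sum(w * h for w, h in zip(output_w, hidden)))

# Two inputs, two hidden neurons, one output (all weights are made up).
y = forward([1.0, 0.0], [[0.4, -0.2], [0.3, 0.1]], [0.5, -0.5])
print(0.0 < y < 1.0)  # -> True
```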

#### **3.1 Analysis of the experimental result**

In this research work, the experimental results are analyzed using the dividerand function in Matlab R2015a. The backpropagation method is used in this analysis because it provides feedback while training the data. The individual responses are evaluated using the Simulink model in ANN, considering all five parameters. Also, ten hidden layers are used to optimize the responses. **Figure 3** shows the Simulink diagram used in ANN.
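The dividerand function randomly partitions the samples among the training, validation and test sets. A hypothetical Python re-implementation of that behaviour (the function name, ratios and seed here are assumptions for illustration, not Matlab's actual code):

```python
import random

def divide_rand(n_samples, train=0.50, val=0.25, test=0.25, seed=0):
    """Randomly split sample indices into training/validation/test sets,
    mimicking the behaviour of Matlab's dividerand."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)           # deterministic shuffle for the demo
    n_train = round(train * n_samples)
    n_val = round(val * n_samples)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

tr, va, te = divide_rand(20)
print(len(tr), len(va), len(te))  # -> 10 5 5
```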

In ANN, the responses are first trained, then validated and finally tested to find out whether there is any linear relationship between the control parameters and the responses. The continuous line is the best-fit linear regression line of outputs versus targets. A regression value (R) of 1 represents a perfect linear relation between the control and response parameters.
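The regression value R reported in these plots measures the linear correlation between network outputs and targets. One simple way to compute it (an illustrative helper, not Matlab's plotregression):

```python
import math

def regression_r(outputs, targets):
    """Pearson correlation R between network outputs and targets;
    R = 1 indicates a perfect linear fit."""
    n = len(outputs)
    mo = sum(outputs) / n
    mt = sum(targets) / n
    cov = sum((o - mo) * (t - mt) for o, t in zip(outputs, targets))
    so = math.sqrt(sum((o - mo) ** 2 for o in outputs))
    st = math.sqrt(sum((t - mt) ** 2 for t in targets))
    return cov / (so * st)

# Outputs that are an exact linear function of the targets give R close to 1.
print(regression_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # close to 1.0
```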

#### *3.1.1 Analysis to maximize MRR*

*Nature Inspired Metaheuristic Approach for Best Tool Work Combination for EDM Process DOI: http://dx.doi.org/10.5772/intechopen.96725*

**Figure 3.** *Simulink diagram of ANN.*

For calculating, 50%, 25% and 25% of the data have been used for training, testing and validation, respectively. MSE has been achieved after 5 successful iterations and, as a result, the training has been terminated. It has been programmed in such a way that, in case the gradient falls below 1.00×10<sup>−7</sup>, the training will terminate. In this experiment the gradient is 5.01×10<sup>−8</sup>. Here the number of successive iterations performed for validation checks is 2. **Figure 4** represents the neural network training performance progress.
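The stopping rules described above — terminate when the gradient magnitude falls below 1.00×10<sup>−7</sup>, or when the validation error worsens for the allowed number of successive checks — can be sketched as follows (illustrative Python, not Matlab's actual training routine):

```python
def train(loss_grad_steps, grad_tol=1e-7, max_val_checks=2):
    """Stopping logic sketch: iterate over (gradient, validation error)
    pairs and halt when the gradient drops below grad_tol or the
    validation error rises max_val_checks times in a row."""
    val_fails = 0
    prev_val_err = float("inf")
    for epoch, (grad, val_err) in enumerate(loss_grad_steps):
        if abs(grad) < grad_tol:
            return epoch, "gradient below tolerance"
        val_fails = val_fails + 1 if val_err > prev_val_err else 0
        prev_val_err = val_err
        if val_fails >= max_val_checks:
            return epoch, "validation checks exceeded"
    return epoch, "max epochs reached"

# Gradient reaches 5.01e-8 (below the 1e-7 tolerance) at the third step.
steps = [(1e-3, 0.9), (1e-5, 0.5), (5.01e-8, 0.4)]
print(train(steps))  # -> (2, 'gradient below tolerance')
```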

Best validation performance is 5.0748×10<sup>−7</sup> at epoch 3, which is a low prediction error measured with MSE, as shown in **Figure 5**. This graph does not show any major problems with the training. The validation and test curves are very similar to each other up to 3 epochs.
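The MSE used as the performance measure here is simply the mean of the squared output errors; for reference:

```python
def mse(predictions, targets):
    """Mean squared error between network predictions and targets."""
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

# Perfect predictions give zero error.
print(mse([0.1, 0.2], [0.1, 0.2]))  # -> 0.0
```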

In this case, the values of R for training, testing and validation are 0.85099, 1 and 1, respectively. From **Figure 6** it is evident that the value of overall R is 0.77755. As the values of R for validation and testing are both more than 0.9, the training shows a good result.

*Computational Optimization Techniques and Applications*

**Figure 4.** *Training performance progress for MRR.*

**Figure 5.** *Performance plot for MRR.*

**Figure 6.** *Regression plot of MRR.*

#### *3.1.2 Analysis to minimize TWR*

For calculating, 55%, 25% and 20% of the data have been used for training, testing and validation, respectively. MSE has been achieved after 4 successful iterations and, as a result, the training has been terminated. It has been programmed in such a way that, in case the gradient falls below 1.00×10<sup>−7</sup>, the training will terminate. In this experiment the gradient is 7.63×10<sup>−8</sup>. **Figure 7** represents the neural network training performance progress.

Best validation performance is 2.06859×10<sup>−5</sup> at epoch 4, which is a low prediction error measured with MSE, as shown in **Figure 8**. This graph does not show any major problems with the training. The validation and test curves are very similar to each other.


In this case, the values of R for training, testing and validation are 0.99999, 1 and 1, respectively. From **Figure 9**, the value of overall R is 0.85855. As the values of R for validation and testing are both more than 0.9, the training shows a good result.

**Figure 7.** *Training performance progress for TWR.*

**Figure 8.** *Performance plot for TWR.*

**Figure 9.** *Regression plot of TWR.*
