**2.5 Results obtained with recurrent neural networks**

To find the NN solution that would provide the most accurate results, different layer recurrent architectures were also tested. Some of the results were presented in (Micu et al. 2011), but a more detailed study is given in the following.

Artificial Intelligence Techniques Applied to Electromagnetic Interference Problems Between Power Lines and Metal Pipelines

A layer recurrent neural network with one output layer and one hidden layer is considered (figure 11). The number of neurons in the hidden layer was varied from 5 to 30 with a step of 5 neurons. The transfer function of the output layer was set to *purelin* (the linear transfer function), while the transfer function of the hidden layer was varied between *tansig* (the hyperbolic tangent sigmoid function) and *logsig* (the logarithmic sigmoid function). The performance evaluation function was also varied between *mse* (mean square error), *msereg* (mean square error with regularization) and *sse* (sum squared error).

Fig. 11. Implemented layer recurrent network architecture.
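As an illustration of the layer recurrent architecture in figure 11, the following Python sketch (a reconstruction for illustration only, not the authors' MATLAB implementation) shows the three transfer functions named above and the forward pass of a one-hidden-layer layer recurrent network, in which the hidden-layer output is fed back to itself through a one-step delay:

```python
import numpy as np

def tansig(x):
    """Hyperbolic tangent sigmoid transfer function (MATLAB's tansig)."""
    return np.tanh(x)

def logsig(x):
    """Logarithmic sigmoid transfer function (MATLAB's logsig)."""
    return 1.0 / (1.0 + np.exp(-x))

def purelin(x):
    """Linear transfer function (MATLAB's purelin)."""
    return x

def layer_recurrent_forward(inputs, W_in, W_rec, b_h, W_out, b_out,
                            hidden_fn=tansig):
    """Forward pass of a one-hidden-layer layer recurrent network:
    the hidden-layer output is fed back through a one-step delay."""
    h = np.zeros(b_h.shape[0])          # delayed hidden state, initially zero
    outputs = []
    for x in inputs:                    # inputs: a sequence of input vectors
        h = hidden_fn(W_in @ x + W_rec @ h + b_h)
        outputs.append(purelin(W_out @ h + b_out))  # linear output layer
    return np.array(outputs)
```

The weight matrices and the choice of `hidden_fn` here are placeholders; in the study they are set by training and by the architecture sweep described above.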

Comparing the average and maximum evaluation errors obtained for the testing data sets of the neural networks which evaluate the amplitude of the MVP, three different NN architectures (AmpLrnNN8, AmpLrnNN19 and AmpLrnNN43) were determined as possible solutions.
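The three candidates were selected from the full architecture sweep described above: six hidden-layer sizes, two hidden-layer transfer functions, and three performance functions. The text does not give the mapping from network names (AmpLrnNN8, etc.) to configurations, so the enumeration below is only a sketch of the search space:

```python
from itertools import product

hidden_sizes = range(5, 31, 5)           # 5, 10, 15, 20, 25, 30 neurons
transfer_fns = ["tansig", "logsig"]      # hidden-layer transfer function
perf_fns = ["mse", "msereg", "sse"]      # performance evaluation function

configs = [{"neurons": n, "transfer": t, "perf": p}
           for n, t, p in product(hidden_sizes, transfer_fns, perf_fns)]
print(len(configs))  # 36 candidate configurations per evaluated quantity
```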

The first one (AmpLrnNN8) has 10 neurons with the *logsig* transfer function on the hidden layer and uses the *mse* performance evaluation function. In this case the average evaluation error obtained for the testing data set is 0.35%, with a maximum evaluation error of 1.08%. Figure 12 presents the evaluation error obtained for both training and testing data sets.

Fig. 12. Absolute evaluation error for AmpLrnNN8 network.
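The three performance functions and the reported error statistics can be sketched as follows. The *msereg* performance ratio and the exact definition of the percentage error are assumptions, since the text does not state them; the sketch treats the reported percentages as relative errors over a data set:

```python
import numpy as np

def mse(err):
    """Mean square error."""
    return float(np.mean(err ** 2))

def sse(err):
    """Sum squared error."""
    return float(np.sum(err ** 2))

def msereg(err, weights, ratio=0.9):
    """Mean square error with regularization: a weighted sum of mse and
    the mean squared network weights (the ratio value is an assumption)."""
    return ratio * mse(err) + (1.0 - ratio) * float(np.mean(weights ** 2))

def evaluation_error(target, predicted):
    """Average and maximum relative evaluation error, in percent."""
    rel = 100.0 * np.abs(predicted - target) / np.abs(target)
    return float(rel.mean()), float(rel.max())
```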

The second possible solution (AmpLrnNN19) for the amplitude network has 5 neurons with the *logsig* transfer function on the hidden layer and uses the *msereg* performance evaluation function. In this case the average evaluation error obtained for the testing data set is 0.65%, with a maximum evaluation error of 1.12%. Figure 13 presents the evaluation error obtained for both training and testing data sets.

Fig. 13. Absolute evaluation error for AmpLrnNN19 network.


The third possible solution (AmpLrnNN43) for the amplitude network has 5 neurons with the *tansig* transfer function on the hidden layer and uses the *sse* performance evaluation function. In this case the average evaluation error obtained for the testing data set is 0.47%, with a maximum evaluation error of 1.26%. Figure 14 presents the evaluation error obtained for both training and testing data sets.

Fig. 14. Absolute evaluation error for AmpLrnNN43 network.

Comparing the results from figures 12, 13 and 14, it can be observed that the most accurate solution is obtained with the AmpLrnNN8 neural network.
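This comparison amounts to ranking the three candidate networks by their testing-set errors. A minimal sketch using the figures reported above (ranking primarily by average error, with the maximum error as a tie-break, which is an assumption about the authors' criterion):

```python
# (average %, maximum %) evaluation errors on the testing set, as reported
candidates = {
    "AmpLrnNN8":  (0.35, 1.08),
    "AmpLrnNN19": (0.65, 1.12),
    "AmpLrnNN43": (0.47, 1.26),
}

# Rank primarily by average error, breaking ties by maximum error.
best = min(candidates, key=lambda name: candidates[name])
print(best)  # AmpLrnNN8
```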

Studying the maximum and average evaluation errors obtained for the neural networks implemented to evaluate the phase of the MVP, three NN architectures (PhaseLrnNN8, PhaseLrnNN19 and PhaseLrnNN44) were determined as possible optimal solutions.

The first one (PhaseLrnNN8) has 10 neurons with the *logsig* transfer function on the hidden layer and uses the *mse* performance evaluation function. In this case the average evaluation error obtained for the testing data set is 1.16%, with a maximum evaluation error of 4.21%. Figure 15 presents the evaluation error obtained for both training and testing data sets.

Fig. 15. Absolute evaluation error for PhaseLrnNN8 network.

The second possible solution (PhaseLrnNN19) for the phase network has 5 neurons with the *tansig* transfer function on the hidden layer and uses the *msereg* performance evaluation function. In this case the average evaluation error obtained for the testing data set is 1.19%, with a maximum evaluation error of 3.02%. Figure 16 presents the evaluation error obtained for both training and testing data sets.

Fig. 16. Absolute evaluation error for PhaseLrnNN19 network.

The third possible solution (PhaseLrnNN44) for the phase network has 10 neurons with the *logsig* transfer function on the hidden layer and uses the *sse* performance evaluation function. In this case the average evaluation error obtained for the testing data set is 1.18%, with a maximum evaluation error of 3.58%. Figure 17 presents the evaluation error obtained for both training and testing data sets.

Fig. 17. Absolute evaluation error for PhaseLrnNN44 network.

Analysing the results shown in figures 15, 16 and 17, the authors concluded that the optimal layer recurrent NN architecture for evaluating the phase of the magnetic vector potential is the PhaseLrnNN19 network structure.

Comparing the results obtained with the implemented neural networks (figures 7–10 and 12–15) with those provided by the fuzzy logic block presented in (Satsios et al. 1999a, 1999b) (figure 5), an accuracy increase of 50% or more can be observed in determining the MVP amplitude and phase, depending on the implemented neural network architecture.

Once the magnetic vector potential is evaluated, the self and mutual impedance matrix, which describes the inductive coupling between the electrical power line and the underground pipeline, can be evaluated using the relationships presented in (Christoforidis et al. 2003, 2005). After that, the equivalent electrical circuit of the studied EPL-MP problem can be solved to obtain the induced AC voltage.

**3. Self and mutual inductance matrix evaluation in case of a three-layer earth**

Considering the accuracy of the results for the MVP obtained from the implemented neural networks, the authors started to develop a neural network solution to evaluate directly the self and mutual impedance matrix, which describes the inductive coupling. In this case a more complex EPL-MP interference problem was chosen for study.

**3.1 Studied electromagnetic interference problem**

An underground gas transportation pipeline runs in the same right of way as a 220 kV/50 Hz electrical power line (figure 18). In order to study a more realistic problem geometry, a multilayer soil (three vertical layers) with different resistivities is considered.

The pipeline is considered to be buried at a depth of 2 m, having a 0.195 m inner radius, a 5 mm wall thickness and a 5 cm bitumen coating. The EPL phase wires are placed on triangular single circuit IT.Sn102 type towers with one sky wire.

Fig. 18. Right of way configuration.

**3.2 Neural network implementation**

In order not to redo the finite element calculation each time a different problem geometry has to be studied, the authors decided to implement a neural network solution to evaluate the self and mutual inductance matrix for any possible problem geometry. As input values, the following geometrical parameters have been selected:
