**1.2 Neural network structure**

Computational neural network structures are modeled on biological neural configurations. The basic building block is a model neuron, shown in Figure 2, consisting of multiple inputs and a single output (a MISO structure). Each input is scaled by a *weight* that multiplies the input value. The neuron combines these weighted inputs, analogous to the dendrites; if the combined activity, analogous to the soma, exceeds a threshold value, then the activation function (the computational counterpart of the biological nucleus) determines the output. In an electronic computational device, as shown in Figure 3, an additional behavioral condition brings the response close to that of a real neuron.
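As a minimal sketch of the model neuron just described (the function name, input values, and weights below are illustrative assumptions, not from the text), the weighted inputs are combined and passed through a threshold-like activation:

```python
import math

def neuron_output(inputs, weights, threshold=0.0):
    """Model MISO neuron: each input is multiplied by its weight,
    the products are combined, and the result is passed through a
    sigmoid activation acting as a soft threshold."""
    combined = sum(u * w for u, w in zip(inputs, weights))
    return 1.0 / (1.0 + math.exp(-(combined - threshold)))

# Illustrative inputs and weights only.
y = neuron_output([0.5, -1.0, 2.0], [0.8, 0.2, 0.5])
```

The sigmoid here stands in for the threshold behavior of the biological neuron; other activation functions could be substituted.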

Fig. 3. Neuron device computational model.

Beyond understanding how an individual neuron operates, many researchers study how neurons organize themselves and the mechanisms by which arrays of neurons adapt their behavior to bounded external stimuli. A large number of experimental neural network computational structures exist, and laboratories and researchers continue to build new neural net configurations.

The most common computational neural net is the *back-propagation network*, characterized by a mathematical model whose behavioral stability conditions are known (bounded-input, bounded-output, or BIBO, conditions).

Intuitively, it is built by taking a number of neurons and arraying them to form a *layer*. Within a layer, all inputs and nodes are interconnected with the nodes of other layers, but not with nodes in the same layer. A layer ends with a set of nodes connected either to a succeeding layer or to the outputs that give the answer. The layers are arranged as an input layer, one or more intermediate layers, and an output layer, as shown in Figure 4; the intermediate layers have no inputs or outputs to the external world and are called *hidden layers*.

Back-propagation neural networks are usually *fully connected*. This means that each neuron is connected to every output from the preceding layer.
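A fully connected forward pass can be sketched as follows; the layer sizes, weight values, and function names are illustrative assumptions, not taken from the text:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weight_matrix):
    """One fully connected layer: every neuron receives every output of
    the preceding layer (weight_matrix[j][i] links input i to neuron j)."""
    return [sigmoid(sum(w * u for w, u in zip(row, inputs)))
            for row in weight_matrix]

def network_forward(x, layers):
    """Propagate a signal through input -> hidden -> output layers."""
    for weight_matrix in layers:
        x = layer_forward(x, weight_matrix)
    return x

# Illustrative MISO network: 3 inputs -> 2 hidden neurons -> 1 output.
layers = [
    [[0.1, 0.4, -0.2],   # hidden neuron 1
     [0.3, -0.5, 0.8]],  # hidden neuron 2
    [[0.7, -0.3]],       # single output neuron
]
y = network_forward([1.0, 0.5, -1.0], layers)
```

Each hidden neuron sees all three inputs and the output neuron sees both hidden outputs, which is what *fully connected* means here.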


Fig. 4. MISO back-propagation network with three layers.

The layers are described as follows: the input layer distributes the signals from the external world; the hidden layers categorize the signals; and the output layer collects all of the detected features and produces a response. However, the description of the layers admits many formulations when the set of optimal weights is considered.

#### **1.3 Neural network operation**

The output of each neuron is a function of its inputs and weights; within a layer it is described recursively in (1).

$$W_j(k) = u_n(k)\, w_{nj}(k) + W_{j-1}(k) \,. \tag{1}$$

where the basic function has the form $W_j(k) = \sum_{i=1}^{n} u_i(k)\, w_{ij}(k)$.

The output of the neural net is the convolution operation shown in (2).

$$y_j(k) = \big(F(k) \circ W(k)\big)_j \,. \tag{2}$$

The value $W_j(k)$ is convolved with a threshold value, giving an approximation to the biological neural response; in the computational sense the neuron is active according to $t_j(k)$, known as the activation function. The activation function is usually the sigmoid function shown in Figure 5. The output $y_j(k)$ is the neural net response; note that the threshold function corresponds to the biological electrical potential of [90, 110] millivolts needed in synaptic operations.

Fig. 5. Sigmoid function as a neural net firing an electrical impulse.

The biological or computational firing responses correspond to threshold conditions fulfilling the excitation functions, which permit an answer to be generated from many inputs. Generally, the weights are selected intuitively in a first step; with adaptive consideration, they can then be adjusted to seek the desired answer.

Adaptation in a neural net means that the net adapts its weights according to an action law, seeking convergence to the desired output. The difference between the desired response $d_j(k)$ and the actual response $y_j(k)$ is known as the convergence *error* $e_j(k)$, defined in (3) and shown in Figure 6.

$$e_j(k) = d_j(k) - y_j(k) \,. \tag{3}$$

The action law could be a sliding mode, a proportional gain on the weights, or other non-linear models that allow the neural net system to converge to the desired answer with respect to the input set.

Fig. 6. Neuronal weights adjustment using an action law.

**2. Neural network adapting its weights using fuzzy logic**
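Before fuzzy logic is brought in, the crisp weight adaptation of Section 1.3 can be sketched with a simple proportional-gain action law; the function names, gain value, and inputs below are illustrative assumptions, not from the text:

```python
def adapt_weights(weights, inputs, desired, gain=0.1):
    """One adaptation step with a proportional-gain action law:
    w_i(k+1) = w_i(k) + gain * e(k) * u_i(k),
    where e(k) = d(k) - y(k) is the convergence error of eq. (3)."""
    y = sum(w * u for w, u in zip(weights, inputs))   # linear neuron output
    e = desired - y                                    # convergence error
    new_weights = [w + gain * e * u for w, u in zip(weights, inputs)]
    return new_weights, e

# Illustrative run: the weights are driven toward the desired response.
weights = [0.0, 0.0]
errors = []
for _ in range(50):
    weights, e = adapt_weights(weights, [1.0, 0.5], desired=1.0)
    errors.append(abs(e))
# The error magnitude shrinks toward zero as the weights converge.
```

With this fixed gain the error contracts geometrically for a bounded input set; a sliding-mode or fuzzy law would replace the `gain * e` term with a different correction.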
