**2.2.2 Block feedback networks (BFN) framework**

A framework for describing recurrent neural networks, introduced by [Santini et al., 1995b], provides a systematic way to design modular networks of high complexity. This class of networks is called Block Feedback Neural Networks (BFN), referring to the *blocks* that can be connected to each other using a small set of elementary connections. The term *feedback* is used because one of the elementary connections is a feedback connection, thus enabling the construction of recurrent neural networks. The network that results from such a construction can in turn be considered a *block* and can be used again as a basic building block for the construction of progressively more complex networks.

The BFN framework thus provides a recursive, modular way of designing networks. The training algorithm for any BFN is based on backpropagation training for MLPs and on the backpropagation through time (BPTT) algorithm for recurrent networks; it is constructed recursively along with the network structure. The BFN framework therefore introduces a class of (infinitely many) recurrent networks, each of which can be trained using a correspondingly constructed backpropagation algorithm. The basic unit is a single neural network layer, an example of which is shown in Fig. 6a; the corresponding matrix notation is shown in Fig. 6b. Here A is a 6-by-3 weight matrix and F(·) is a 6-by-6 diagonal mapping containing the neuron activation functions (of the form of equation 1). One such layer is defined as a *single layer block* N (Zhenzhen Liu & Itamar Elhanany, 2008).

Fig. 6. a) Example of a single network layer; b) the layer in BFN block notation as a block N

This block computes the function:

$$o(n) = F(A \cdot i(n))\tag{1}$$
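A single layer block of this form can be sketched in a few lines. The 6-by-3 shape of A follows the example in Fig. 6; the tanh activation and the random weights are illustrative assumptions, since the framework leaves the choice of F(·) and the training of A open:

```python
import numpy as np

# Minimal sketch of a BFN single layer block, equation (1).
# Assumptions: tanh activations for F(.) and random weights for A.
rng = np.random.default_rng(0)

A = rng.standard_normal((6, 3))   # 6-by-3 weight matrix A


def F(x):
    # Diagonal mapping: each activation function acts on one
    # component of its input vector independently.
    return np.tanh(x)


def single_layer_block(i_n):
    # o(n) = F(A . i(n))  -- equation (1)
    return F(A @ i_n)


i_n = np.ones(3)                  # 3-dimensional input vector
o_n = single_layer_block(i_n)     # 6-dimensional output vector
```

Because F(·) is a diagonal mapping, the block is just a weight matrix followed by componentwise activations, which is what makes it a convenient atomic unit for composition.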

Single layer blocks can be connected together using the four elementary connections shown in Fig. 7: the cascade, the sum, the split and the feedback connection. Each of these connections consists of one or two embedded BFN blocks (called N1 and N2 in the figure) and one connection layer, which has the structure of a single-layer block. This connection layer consists of the weight matrices A and B and the vector function F(·). Each of the four elementary connections is itself defined as a block and can therefore be used as the embedded block of yet another elementary connection.

Fig. 7. The four elementary BFN connections: the cascade (a), feedback (b), split (c) and sum (d) connection.
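Two of the elementary connections can be sketched to show how blocks compose recursively: the cascade simply chains two blocks, while the feedback connection feeds the previous output back through an embedded block into the connection layer. The matrix shapes, the tanh choice for F(·), and the exact placement of A and B in the feedback loop are illustrative assumptions, not the framework's definitive formulation:

```python
import numpy as np

# Hedged sketch of the cascade and feedback connections.
# Every constructor returns a callable "block" mapping an input
# vector to an output vector, so results can be embedded again.
rng = np.random.default_rng(1)


def make_single_layer_block(out_dim, in_dim):
    A = rng.standard_normal((out_dim, in_dim))
    return lambda x: np.tanh(A @ x)       # o = F(A . i)


def cascade(N1, N2):
    # Cascade connection: the output of N1 feeds the input of N2.
    # The result is itself a block.
    return lambda x: N2(N1(x))


def feedback(N1, out_dim, in_dim):
    # Feedback connection (assumed form): the previous output is
    # passed through the embedded block N1 and combined with the
    # current input in a connection layer with matrices A and B.
    A = rng.standard_normal((out_dim, in_dim))
    B = rng.standard_normal((out_dim, out_dim))
    state = {"o": np.zeros(out_dim)}

    def block(x):
        state["o"] = np.tanh(A @ x + B @ N1(state["o"]))
        return state["o"]

    return block


# Usage: a cascade of two single layer blocks ...
net = cascade(make_single_layer_block(4, 3),
              make_single_layer_block(2, 4))
y = net(np.ones(3))                       # 2-dimensional output

# ... and a recurrent block built with the feedback connection.
rec = feedback(make_single_layer_block(4, 4), out_dim=4, in_dim=3)
y1 = rec(np.ones(3))
y2 = rec(np.ones(3))                      # state fed back: y2 != y1
```

The key property mirrored here is closure: `cascade(...)` and `feedback(...)` return objects with the same call signature as a single layer block, so they can serve as the embedded blocks of further elementary connections.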
