5.2. History and intuition of neural networks

Around the turn of the twentieth century, scientists were able to definitively identify the primary functional unit of the brain: the neuron. One theory of the time held that information is not pre-loaded in the brain of a newborn child; only the basic structure and the connections between the neurons in its brain exist at birth, and the brain learns to function by strengthening or weakening various neural pathways.


It was therefore theorized that the neuron, which either fires or does not, can be modeled as a function with multiple inputs and a single output (which may be connected to several other neurons), and which 'fires' only when the weighted sum of its inputs exceeds a certain threshold. The connections between neurons carry weights, which strengthen or weaken the connection between two neurons.
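This threshold-unit model can be sketched in a few lines of code. The particular inputs, weights, and threshold below are illustrative assumptions, not values from the text:

```python
def threshold_neuron(inputs, weights, threshold):
    """Fire (output 1) only if the weighted sum of inputs exceeds the threshold."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum > threshold else 0

# Example: a neuron with three inputs, one of which is inactive.
# Weighted sum = 1*0.5 + 0*0.8 + 1*0.4 = 0.9, which exceeds 0.6, so it fires.
output = threshold_neuron(inputs=[1, 0, 1], weights=[0.5, 0.8, 0.4], threshold=0.6)
print(output)  # 1
```

Note that the output is all-or-nothing, mirroring the observation that a biological neuron either fires or does not.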

"Neurons that wire together, fire together"


The above quotation may be familiar. It is a popular summary of Donald Hebb's theory of neural adaptation and learning, and it forms the basis of learning in modern-day artificial neural networks.
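Hebb's idea can be expressed as a simple update rule: the weight of a connection grows in proportion to how strongly the two neurons it joins are active at the same time. The following is a minimal sketch; the learning rate and activity values are illustrative assumptions:

```python
def hebbian_update(weight, pre_activity, post_activity, learning_rate=0.1):
    """Strengthen a connection in proportion to the co-activity of the two neurons."""
    return weight + learning_rate * pre_activity * post_activity

w = 0.2
# Both neurons fire together: the connection is strengthened.
w = hebbian_update(w, pre_activity=1.0, post_activity=1.0)
print(round(w, 2))  # 0.3
# The pre-synaptic neuron is silent: the connection is left unchanged.
w = hebbian_update(w, pre_activity=0.0, post_activity=1.0)
print(round(w, 2))  # 0.3
```

Modern networks are trained with more sophisticated rules, but the core idea is the same: learning happens by adjusting connection weights based on activity.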
