BACK PROPAGATION NEURAL NETWORK DEVELOPMENT

Megan Guberski

For the past 150 years humanity has been fascinated with the idea of a non-human intelligence. Originally this intelligence was imagined as biological alien life, but since the 1950s the idea of a non-biological artificial intelligence has gained popularity. Though a truly independent AI is still more fiction than fact, scientists have developed methods suggesting that AI is a conceivable goal of the near future. One such method is the use of neural networks, whose promise lies in their ability to learn from example, just as humans do. In fact, neural networks are modeled on biological neurons and use a system of weighted connections to communicate with one another in place of the neurotransmitters used by biological neurons.

Over the summer a simple neural network was developed, composed of three layers: input, middle, and output, with weighted connections between the input and middle layer neurons and between the middle and output layer neurons. Because the neurons in the middle layer had no contact with the 'outside' world, the layer was referred to as the hidden layer. Input presented to the network was passed through the weighted connections between the input and hidden layers. The total input to each hidden layer neuron was then summed and passed through a non-linear sigmoid function, and the resulting scalar was passed through the hidden-to-output weighted connections, becoming input to the output layer neurons. At each output layer neuron the inputs were again summed and put through the sigmoid function, and the resulting value became the network's output.
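
As a concrete illustration of the forward pass just described, the following is a minimal sketch in Python with NumPy. The layer sizes, function names, and use of NumPy are assumptions made for illustration here, not details of the summer implementation.

    import numpy as np

    def sigmoid(x):
        # Non-linear squashing function applied at each neuron.
        return 1.0 / (1.0 + np.exp(-x))

    def forward(x, w_ih, w_ho):
        # One forward pass: input -> hidden -> output.
        #   x    : input vector
        #   w_ih : input-to-hidden weight matrix
        #   w_ho : hidden-to-output weight matrix
        hidden = sigmoid(w_ih @ x)       # sum weighted inputs, then squash
        output = sigmoid(w_ho @ hidden)  # repeat at the output layer
        return hidden, output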

Once the network was developed, supervised teaching using the back propagation algorithm was implemented. The algorithm, popularized by Rumelhart, Hinton, and Williams, finds the squared error between the actual and desired outputs, then works backwards through the network to calculate how much each weight contributed to the error. The weights are then adjusted in proportion to their contribution, and the process is repeated until the network's error is sufficiently small.
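
To show how the algorithm assigns blame to each weight, here is a hedged sketch of a single back propagation update for the three-layer network above, again in Python with NumPy. The function name, learning rate, and vector shapes are illustrative assumptions rather than the original code.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def backprop_step(x, target, w_ih, w_ho, lr=0.5):
        # One supervised update: forward pass, squared error,
        # backward pass, then gradient-descent weight changes.
        hidden = sigmoid(w_ih @ x)
        output = sigmoid(w_ho @ hidden)

        # Error signal at the output layer (derivative of the
        # squared error times the sigmoid's derivative).
        delta_out = (output - target) * output * (1.0 - output)

        # Work backwards: how much each hidden neuron contributed.
        delta_hid = (w_ho.T @ delta_out) * hidden * (1.0 - hidden)

        # Change each weight in proportion to its contribution.
        w_ho -= lr * np.outer(delta_out, hidden)
        w_ih -= lr * np.outer(delta_hid, x)

        return 0.5 * np.sum((target - output) ** 2)  # squared error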

Using the back propagation algorithm I was able to program a network that learned to handle the XOR logic problem, as well as a network that recognized the number of ones used in a binary sequence. The next step, undertaken as a special studies project during the school year, will be to expand the simple back propagation network into a recurrent network that is able to handle a temporal element.
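
To make the XOR result concrete, here is a toy training loop built on the backprop_step sketch above. The network shape (two inputs plus a constant bias input, three hidden units, one output), the random seed, learning rate, and stopping threshold are all illustrative assumptions; XOR training is known to be sensitive to initialization, so convergence may require a different seed or more epochs.

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR truth table, with a constant 1 appended to each input
    # so the hidden units can learn bias-like behavior.
    patterns = [(np.array([0., 0., 1.]), np.array([0.])),
                (np.array([0., 1., 1.]), np.array([1.])),
                (np.array([1., 0., 1.]), np.array([1.])),
                (np.array([1., 1., 1.]), np.array([0.]))]

    # Small random weights: 3 inputs -> 3 hidden -> 1 output.
    w_ih = rng.uniform(-1.0, 1.0, size=(3, 3))
    w_ho = rng.uniform(-1.0, 1.0, size=(1, 3))

    for epoch in range(20000):
        total_error = sum(backprop_step(x, t, w_ih, w_ho)
                          for x, t in patterns)
        if total_error < 1e-3:  # sufficiently small error: stop
            break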

Advisor: Judy Franklin