Introduction

Neural networks are a powerful tool in modern intelligent systems. Many applications involving pattern recognition, feature mapping, clustering, and classification use neural networks as an essential component. In recent decades, several types of neural networks have been developed; back-error-propagation networks, Kohonen feature maps, and Hopfield networks are some of the basic architectures used in many applications.

In this article, a general Multi-Layer Perceptron (MLP) trained with back error propagation is presented. The sample application uses this MLP to classify blue and red patterns generated by the user.

Network structure

The figure below shows a typical MLP with three layers of weights.

A typical neural network consists of these objects: layers (usually three), a number of neurons in each layer, and synapses connecting the layers. Based on these objects, I developed corresponding classes.

Classes used in MLP

Neuron: The basic element of the network, with members such as output, sum, and delta.
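The Neuron object described above might be sketched as follows. The article does not show its source, so this is a minimal illustration in Python; the member names (output, sum, delta) come from the text, while the sigmoid activation, the fixed initial weights, and the delta rule are standard back-propagation assumptions, not necessarily the author's exact implementation.

```python
import math

class Neuron:
    """Basic network element with the members named in the article."""

    def __init__(self, num_inputs):
        # Small fixed weights for reproducibility; real code would randomize.
        self.weights = [0.1] * num_inputs
        self.sum = 0.0      # weighted input sum
        self.output = 0.0   # activation of the neuron
        self.delta = 0.0    # error term used by back error propagation

    def activate(self, inputs):
        # Weighted sum followed by a sigmoid activation (a common choice).
        self.sum = sum(w * x for w, x in zip(self.weights, inputs))
        self.output = 1.0 / (1.0 + math.exp(-self.sum))
        return self.output

    def compute_output_delta(self, target):
        # Delta rule for an output neuron with sigmoid activation.
        self.delta = (target - self.output) * self.output * (1.0 - self.output)
        return self.delta
```

A layer is then just a list of such neurons, and the synapses are the weight vectors held by each neuron.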

How to use Sample Application

To use this program, follow these steps:

Clear the screen by pressing the Clear button.

Define your network structure by putting appropriate values in the H1, H2, and H3 edit boxes. These are the numbers of neurons in the hidden layers. If a value is 0, that layer is not included in the network (by default the network has three layers: one input, one hidden, and one output).

Put some blue and red points on the screen by clicking on it (select the color from the combo box).

Now you can press the “Initialize network” button.

Set the maximum number of iterations and press “Learn” until the network correctly classifies all points on the screen. You can see the error graph by clicking “Show error graph”. The “smartgraph” component is used in this project, and you must download and register it on your computer to see the error graph correctly. It is available here.

Error values can be written to a text file if you check “Write errors to file”.

This program is fully object oriented and is easy to understand if you are familiar with Multi-Layer Perceptrons. Finally, I used this class to develop a Farsi OCR application, and it worked very well!

Comments and Discussions

Please, would you help me prepare my project on handwritten word and letter recognition with a TDNN (time-delay neural network)? The language is C#.
Thank you very much.
Contact me at:
Doc_dh23000@hotmail.com

SNN is free software for experimenting with neural network classification.
It lets you control more parameters.

I had some problems using your software: the application suddenly closed during learning.

You might consider removing the 'initialize network' feature and performing initialization by default in the Learn button.
If you prefer to keep it, you could provide two buttons: Learn (with initialization) and Learn more (without initialization).
That would make the program much easier to use: just one click to start.
Also, placing more points at the start would be a good idea; neural networks like more training data.

That's very simple. In fact, it is the same as what I did in my demo application, except that you must get the inputs from a text file rather than from the screen. Assume red points are class -1 and blue points are class +1; everything else stays the same. Good luck.
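The reply above (red points as class -1, blue points as class +1, read from a text file) could be sketched like this in Python. The `x y color` line format is a hypothetical layout chosen for illustration; the article does not define a file format, so adapt the parsing to your own data.

```python
def load_points(lines):
    """Parse labeled points: each line is 'x y color'; red -> -1, blue -> +1.

    The 'x y color' format is an assumption for illustration only; change
    the parsing to match whatever layout your data file actually uses.
    """
    label_of = {"red": -1, "blue": +1}
    samples = []
    for line in lines:
        parts = line.split()
        if len(parts) != 3:
            continue  # skip blank or malformed lines
        x, y, color = float(parts[0]), float(parts[1]), parts[2].lower()
        samples.append(((x, y), label_of[color]))
    return samples
```

Calling `load_points(open("points.txt"))` would then give the same (point, label) pairs the demo application builds from mouse clicks.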

Hello all,
To use this neural network with some real data, there is a free dataset of handwritten Farsi digits at this location: http://www.farsiocr.ir/farsi-digit-dataset
You can also contact me directly (HosseinKhosravi@gmail.com) and I will guide you on how to access the whole dataset of Farsi digits. The dataset is free for non-commercial use and contains more than 102,000 handwritten Persian digits.
Good luck!

Hello,
I'm trying to use your code for an OCR application, and I'm facing two problems.
1. I cannot converge to an error of less than 0.15.
2. After training, I always get the same output values regardless of the input set I apply.
Did you change or add anything to the NN classes used here when you made the Farsi OCR application?

This is absolutely cool, Andrew. I would like to ask you two questions.
1. I would like to reuse the NN code to train on a different dataset that has both nominal and numerical attributes. Can your NN accept nominal inputs?
2. Is it true that I only need to replace your training-data generation with another dataset, and the NN will work just fine?
Thanks
Patrick

I want to use a neural network to recognize 3 voice commands, but I cannot create a network like this one. Can I use this article to write a program for my purpose?
It is very important for my program about knitting Persian rugs. For now, I detect sound frequencies with the Fourier transform (FFT), but I am limited to recognizing only frequencies.

Thanks for the article, most interesting. One suggestion: you mention using your backprop system to do OCR for Farsi; it would be great if you could describe this process in more detail. This is exactly the kind of discussion that gets missed in most articles and books. For example, how do you format your data ready for input to the MLP, and how do you check that the recognizer is working correctly?

First, I extracted letters from some handwritten forms and normalized them. The normalized image (64×64 pixels) was then segmented into 8×8 blocks, and the gray-level average of each block was the value sent to each input neuron (64 neurons at the input). Of course, there are other features that could be used, but the scope of this article does not allow me to describe all my work. Interestingly, with this simple feature I reached 96% accuracy on my training data (12,800 letters) and 88% accuracy on my test data (3,200 letters).
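The feature extraction described above (a 64×64 image split into 8×8-pixel blocks, with the gray-level average of each block feeding one of 64 input neurons) can be sketched as follows. The image is assumed here to be a row-major list of rows of gray levels; the original implementation is not shown in the article.

```python
def block_averages(image, block=8):
    """Split a square image into block x block tiles and average each tile.

    For a 64x64 image with block=8 this yields 8*8 = 64 values, one per
    input neuron, as described in the comment above.
    """
    size = len(image)
    assert size % block == 0, "image side must be divisible by the block size"
    features = []
    for by in range(0, size, block):          # top row of each tile
        for bx in range(0, size, block):      # left column of each tile
            total = sum(image[y][x]
                        for y in range(by, by + block)
                        for x in range(bx, bx + block))
            features.append(total / (block * block))
    return features
```

The resulting 64-element list is what would be presented to the input layer of the MLP.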

Khosravi, do you remember me?
I am Damon Chitsaz, your old high-school friend! Remember Javad (Reza Goodarzi)?
It has been so long! What a strange coincidence to meet you here!
What is your email address?
Mine is damoncz@hotmail.com
Contact me!

Your code is excellent: efficient and easy to understand, provided that the reader knows the theory behind back-error-propagation neural nets. The program is also very nice, with appealing graphics.
However, your article doesn't explain the theory behind the code well. For those wanting to understand how neural nets work, I suggest:
http://www.generation5.org/articles.asp?Action=List&Topic=Neural+Networks
It has lots of articles covering back-propagation and Kohonen neural nets.