In terms of problem solving, an algorithm is any general procedure that solves a problem, rather than a single instance of that problem. Put it this way: the steps you follow in your head to carry out a division (choosing my words very carefully) form an algorithm, because division is a problem; but dividing 21 by 7, or 100 by 30, are problem instances, and the exact steps you follow to solve those are instances of the algorithm, not algorithms themselves.

Now that we're clear on that, let's talk about classical problems. These are problems that classical set theory and logic, and therefore classical algorithms, can solve. But some problems, especially optimization problems (usually finding a maximum or minimum), are nearly impossible to solve with classical methods, mostly because those methods are too slow. In such cases we use methods that are a bit smarter, or at least more efficient. For instance, say you have a document with 2 million pages and you look through it for a phrase, a specific page, or a paragraph. A classical method guarantees success eventually, but in the worst case it takes 2 million times N, where N is the time it takes to look through one page for what you want. If instead you had multiple people each checking random pages, and each used what they found to guide where to look next, their chances of success would improve as they went. This idea of guided random search is the basis of the Genetic Algorithm, and algorithms built on it are called genetic algorithms.
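To make the worst-case cost of the classical approach concrete, here is a minimal sketch of an exhaustive page-by-page scan. The page contents and the phrase are made up purely for illustration:

```python
def linear_scan(pages, phrase):
    # Classical exhaustive search: check pages in order until the phrase
    # turns up. In the worst case every page is inspected, so the cost is
    # (number of pages) times (time to scan one page).
    checks = 0
    for i, page in enumerate(pages):
        checks += 1
        if phrase in page:
            return i, checks
    return None, checks

# Worst case: the phrase sits on the very last of 2 million pages.
pages = ["lorem ipsum"] * 1_999_999 + ["the quick brown fox"]
index, checks = linear_scan(pages, "quick brown")
print(index, checks)  # the scan had to touch all 2,000,000 pages
```

The scan is guaranteed to find the phrase if it exists, which is exactly the "definitive success" property, but it pays for that guarantee with the full 2-million-page worst case.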

Now that we're clear on what needs to be done, let's look at another example. Say you have a function that is multi-dimensional, continuous, and hard to compute, and you're looking for its maximum or minimum; for this example, let's say the maximum. Because it's hard to compute, you can't visualize it, and you can't check every possible input, since the domain is continuous; by classical problem-solving standards, missing even one point or input would be a mistake.

What a GA proposes is that we treat the problem like life. We generate a number of random inputs, or points, within the domain of the function, and call them a GENERATION. Each point has specific values: if your function is 3-dimensional, say z = F(x, y), each point has a value for x and one for y, and z is the output the function produces for it. We call those values its DNA, or genes if you will, and based on those genes and the outcome they produce, we calculate a fitness percentage for the point.

The details depend on lots of measures, but for now let's just say we have a fitness function, and we generate, say, 10 random points on the function F, each with a fitness level or percentage. The higher a point's fitness, the more likely it is to be picked, just like in life, where those who adjust best to their environment are the most likely to survive. So if a point has a fitness level of 83 percent, it has an 83 percent chance of being picked. We pick 2 points, they "mate" and produce a child that receives half its genes from parent 1 and half from parent 2, with some chance of being mutated as well. The child is kept, 2 more parents are picked, and the process repeats until we have 10 children, which then replace the old generation. These children have new fitness values, they mate just as their parents did to produce new children, and the process repeats from there.
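The loop described above can be sketched in a few lines. The objective function, its domain, the population size, and the mutation step are all illustrative assumptions; the structure (generation, fitness-proportional selection, half-and-half crossover, rare mutation, wholesale replacement) follows the description:

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

# Hypothetical objective to maximize: f(x, y) = -(x - 3)^2 - (y + 1)^2,
# which peaks at 0 when x = 3 and y = -1.
def f(x, y):
    return -(x - 3) ** 2 - (y + 1) ** 2

LOW, HIGH = -10.0, 10.0  # assumed domain of each gene
POP_SIZE = 10            # 10 random points per generation
MUTATION_RATE = 0.01     # chance that any single gene mutates

def select_parents(population, weights):
    # Fitness-proportional selection: higher fitness, higher chance of being picked.
    return random.choices(population, weights=weights, k=2)

def crossover(p1, p2):
    # The child takes half its genes from each parent: x from one, y from the other.
    return [p1[0], p2[1]]

def mutate(child):
    # Each gene has a small chance of being nudged to a nearby value.
    return [g + random.gauss(0.0, 1.0) if random.random() < MUTATION_RATE else g
            for g in child]

population = [[random.uniform(LOW, HIGH), random.uniform(LOW, HIGH)]
              for _ in range(POP_SIZE)]

for generation in range(200):
    scores = [f(x, y) for x, y in population]
    # Shift scores so every selection weight is positive (f can be negative here).
    floor = min(scores)
    weights = [s - floor + 1e-9 for s in scores]
    children = []
    while len(children) < POP_SIZE:
        p1, p2 = select_parents(population, weights)
        children.append(mutate(crossover(p1, p2)))
    population = children  # the children replace the old generation

best = max(population, key=lambda p: f(*p))
print("best point:", best, "value:", f(*best))
```

Note that the function is only ever evaluated at the population's points, never exhaustively, which is the whole appeal when the function is expensive to compute.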

Now, if we know our objective to a certain point, or we reach a member of the population, a child, with a 100 percent fitness level, we're done. If not, we eventually reach a point where new generations stop producing better results. At that point we simply take the best member or child found so far as our objective, in this case the maximum.

The challenges we face are the method of generating children, the computation of the fitness level, and the choice of mutation probability, a constant between 0 and 1 that is usually set below 0.01, i.e. under a 1 percent chance of any given gene being mutated. One of the more common generation methods is called crossover, which takes half of the genes from one parent and the other half from the other parent. The generation method depends somewhat on the function, and the mutation probability usually depends on it less, but the one thing that depends directly on the function and your objective is the fitness function. For instance, if you're searching through the 2-million-page document I mentioned, you probably know the objective, which is a particular word, sentence, phrase, or even a whole page. In those cases your fitness function is very simple.
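For the known-objective case, the three ingredients above are only a few lines each. This is a sketch under assumed parameters (a short target string standing in for the phrase, lowercase candidates, a mutation rate under 1 percent), not a full search loop:

```python
import random
import string

TARGET = "genetic"             # assumed stand-in for the phrase we already know
CHARS = string.ascii_lowercase
MUTATION_RATE = 0.01           # under 1 percent per gene, as discussed

def fitness(candidate):
    # Fraction of positions that already match the target: simple precisely
    # because the objective is known in advance.
    matches = sum(c == t for c, t in zip(candidate, TARGET))
    return matches / len(TARGET)

def crossover(p1, p2):
    # Half the genes (characters) from one parent, the rest from the other.
    mid = len(TARGET) // 2
    return p1[:mid] + p2[mid:]

def mutate(candidate):
    # Each character has a small chance of being replaced by a random one.
    return "".join(random.choice(CHARS) if random.random() < MUTATION_RATE else c
                   for c in candidate)

print(fitness("genetic"), crossover("aaaaaaa", "bbbbbbb"))
```

A candidate with fitness 1.0 matches the target exactly, which is the 100-percent-fitness stopping condition mentioned earlier.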

I'm working on GAs as an extracurricular activity for my GA class and AI class, and I find this topic very interesting, since it falls directly under AI.