Abstract

In this paper we test a particular variant of the (Repulsive) Particle Swarm method on some rather difficult global optimization problems. A number of these problems are collected from the extant literature, and a few of them are newly introduced. First, we introduce the Particle Swarm method of global optimization and its variant called the 'Repulsive Particle Swarm' (RPS) method. Then we endow the particles with some stronger local search abilities - much like tunneling - so that each particle can search its neighborhood to optimize itself. Next, we introduce the test problems, the existing as well as the new ones. We also give plots of some of these functions to aid appreciation of the difficulty of the optimization problems. Finally, we present the results of the RPS optimization exercise and compare them with those obtained by the Genetic Algorithm (GA) and/or the Simulated Annealing (SA) method. We append the (Fortran) computer program that we have developed and used in this exercise.
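The RPS variant described above can be sketched in a few lines: each particle is attracted to its own best position, repelled from the best position of a randomly chosen other particle, perturbed by a random impulse, and then refined by a greedy neighborhood (tunneling-like) search. The sketch below is in Python rather than the paper's Fortran, and the coefficient values, the Gaussian local-search step, and the function names are illustrative assumptions, not the paper's exact settings.

```python
import random

def rps_minimize(f, dim, bounds, n_particles=30, iters=500, seed=0):
    """Minimal sketch of a Repulsive Particle Swarm with a greedy local
    search step; all parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]            # each particle's best position so far
    Pf = [f(x) for x in X]           # value at that best position
    w, a, b, c = 0.7, 2.0, -0.5, 0.5  # inertia, attraction, repulsion, noise
    for _ in range(iters):
        for i in range(n_particles):
            h = rng.randrange(n_particles)   # randomly chosen other particle
            for d in range(dim):
                r1, r2, r3 = rng.random(), rng.random(), rng.random()
                z = rng.uniform(-1.0, 1.0)   # random impulse component
                V[i][d] = (w * V[i][d]
                           + a * r1 * (P[i][d] - X[i][d])      # own best
                           + w * b * r2 * (P[h][d] - X[i][d])  # repulsion (b < 0)
                           + w * c * r3 * z)
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            # tunneling-like local search: probe the neighborhood, keep improvements
            trial = [min(hi, max(lo, x + rng.gauss(0.0, 0.1))) for x in X[i]]
            if f(trial) < f(X[i]):
                X[i] = trial
            fx = f(X[i])
            if fx < Pf[i]:
                Pf[i], P[i] = fx, X[i][:]
    k = min(range(n_particles), key=Pf.__getitem__)
    return P[k], Pf[k]
```

For example, minimizing the sphere function `f(x) = sum(v*v for v in x)` over `[-5, 5]^2` with this sketch drives the best value toward zero.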

Our findings indicate that neither the RPS nor the GA/SA method can assuredly find the optimum of an arbitrary function. In the case of the Needle-eye and Corana functions both methods perform equally well, while for Bukin's 6th function both yield values of the decision variables far from the correct ones. For the zero-sum function, the GA performs better than the RPS. For the Perm #2 function, both methods fail as the dimension grows larger. In several cases the GA falters or fails while the RPS succeeds. For the N#1 through N#5 functions and the ANN XOR function, the RPS performs better than the Genetic Algorithm.

Criteria are needed to classify problems as suited (or unsuited) to a particular method. Such a classification would highlight the comparative advantage of using a particular method for a particular class of problems.