A Quick Analysis of Heuristic Optimization by Stochastic Genetic Algorithms and Particle Swarm

I have been inspired by Varadi’s latest series of posts at http://cssanalytics.wordpress.com on heuristic optimization techniques (particularly Genetic Algorithms, GAs, and Particle Swarm Optimization, PSO) to perform a quick analysis comparing these techniques. I highly recommend reading the http://cssanalytics.wordpress.com posts from Sept. 3-6, 2013, for a detailed explanation of GAs and PSO before viewing my analysis.

We investigate the optimization performance of both Genetic Algorithms (GAs) and Particle Swarm Optimization (PSO). In particular, we are interested in comparing the efficiency and accuracy of these techniques in optimizing a variety of well-known test functions. The test functions used are outlined below.
We consider three classes of GAs:

  1. Matlab’s built-in traditional GA (MatlabGA).
  2. A traditional GA as outlined by Varadi at http://cssanalytics.wordpress.com/ (GA).
  3. A simplistic Micro-GA (mGA) as outlined by Varadi at http://cssanalytics.wordpress.com/.

A Brief Review of Traditional GAs
Recall that, like most evolutionary algorithms, a GA is a stochastic sampling method that repeatedly refines a population, or set, of candidate solutions. The method is based on the Darwinian notion of “survival of the fittest,” in which only the elite solutions (those that return the best objective function values) are kept, while sub-optimal solutions are discarded.

A typical GA begins by randomly generating an initial population of candidate solutions, called the parents. The population at each generation is then enlarged by producing a combination of cross-over children and/or mutation children. Cross-over children are derived by mixing and matching individual components of the parents, in what Varadi describes as mating. Formally, at each component i of the child vector, the default crossover function randomly selects a value, x_i, from one of the two parents and assigns it to the i^{th} component of the cross-over child. Mutation children are created by randomly perturbing the parent solutions, often via a Gaussian distribution.

Next, we evaluate the objective function at each potential solution in the current population. Only the fraction of individuals in the current population that provide the best objective function values continue onward to form the parent population of the next generation.
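
To make the mechanics concrete, here is a minimal sketch of one such generation in Python/NumPy. This is only an illustration, not the code used in the experiments below; the child counts, the 50/50 crossover mask, and the Gaussian mutation scale are assumptions chosen for readability.

```python
import numpy as np

def ga_generation(parents, objective, n_crossover=100, mutation_scale=0.1, rng=None):
    """One GA generation: cross-over children, Gaussian mutation children, elite selection."""
    rng = np.random.default_rng() if rng is None else rng
    n_parents, d = parents.shape

    # Cross-over children: each component is copied at random from one of two parents.
    pairs = rng.integers(n_parents, size=(n_crossover, 2))
    mask = rng.random((n_crossover, d)) < 0.5
    crossover = np.where(mask, parents[pairs[:, 0]], parents[pairs[:, 1]])

    # Mutation children: parents perturbed by Gaussian noise.
    mutants = parents + rng.normal(scale=mutation_scale, size=parents.shape)

    # Enlarge the population, evaluate the objective, and keep only the fittest
    # n_parents individuals (minimization) as the next generation's parents.
    population = np.vstack([parents, crossover, mutants])
    fitness = np.apply_along_axis(objective, 1, population)
    elite = np.argsort(fitness)[:n_parents]
    return population[elite], fitness[elite][0]
```

For example, one generation on the 2-D sphere function could be run as ga_generation(np.random.uniform(-5, 5, (100, 2)), lambda x: float(np.sum(x**2))).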

The mGA, on the other hand, is a simplified version of the traditional GA that adds a tournament phase to determine which parent solutions are worthy of mating. Those parents that win the tournament produce cross-over children, as described above and at http://cssanalytics.wordpress.com/.
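
Varadi’s posts spell out the exact tournament rules; purely as a generic sketch (the binary pairing and tournament size here are my own assumptions), a tournament selection step might look like:

```python
import numpy as np

def tournament_select(population, fitness, n_winners, rng=None):
    """Binary tournament: draw random pairs and keep the fitter of each pair (minimization)."""
    rng = np.random.default_rng() if rng is None else rng
    winners = []
    for _ in range(n_winners):
        i, j = rng.choice(len(population), size=2, replace=False)
        winners.append(population[i] if fitness[i] < fitness[j] else population[j])
    return np.array(winners)
```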

For consistency among the various GAs we have kept the population size and maximum number of generations fixed at 100 and 100d, respectively, where d is the number of input dimensions of the test function. Convergence is declared either upon reaching the maximum number of generations or when the average relative difference between a generation’s best solution and the best solution found, taken over the most recent 50 generations, is less than a tolerance, tol=10^{-6}.
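
One way to implement that stopping test (this is my reading of the criterion; the guard against division by zero is an added assumption, and the PSO runs below reuse the same check with generations replaced by iterations):

```python
def has_converged(best_per_generation, window=50, tol=1e-6):
    """Stop when the best objective values of the last `window` generations differ,
    on average, from the best value found so far by less than `tol` (relative)."""
    if len(best_per_generation) < window:
        return False
    best_so_far = min(best_per_generation)
    recent = best_per_generation[-window:]
    denom = max(abs(best_so_far), 1e-12)  # avoid dividing by zero
    avg_rel_diff = sum(abs(b - best_so_far) for b in recent) / (window * denom)
    return avg_rel_diff < tol
```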

A Brief Review of PSO

PSO is a gradient-free, stochastic optimization algorithm. Like GAs and mGAs, PSO begins by generating an initial set of candidate solutions, or particles, collectively called the swarm. At each iteration, individual particles of the swarm navigate the search space based on each particle’s personal best position and the overall best position found by the swarm. Therefore, previous locations that minimize the objective function, f, act as attractors towards which the swarm will migrate. The position of each particle, and of the swarm alike, is defined in terms of its previous position, x, and the velocity, v, of the particle (or swarm). The velocity (or momentum factor) describes a particle’s tendency to move in a certain direction. The position and velocity are defined as:
x_{n+1} = x_n + v_n and
v_{n+1} = v_n + r_1(x_p - x_n) + r_2(x_s - x_n),

where r_1 and r_2 are random numbers on the interval (0,1), x_p denotes the best previous position of that particle, and x_s denotes the best previous position of the swarm.
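
A minimal sketch of one swarm update following the two rules above (assuming minimization; the bookkeeping of personal and swarm bests is my own illustrative choice, not a prescription from the original posts):

```python
import numpy as np

def pso_step(x, v, p_best, p_best_val, objective, rng=None):
    """One PSO iteration: x_{n+1} = x_n + v_n, then
    v_{n+1} = v_n + r1*(x_p - x_n) + r2*(x_s - x_n)."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = x.shape

    # Swarm attractor: best position any particle has visited so far.
    x_s = p_best[np.argmin(p_best_val)]

    # Move with the current velocities, then update velocities using the old
    # positions, matching the equations quoted above.
    x_new = x + v
    r1 = rng.random((n, d))
    r2 = rng.random((n, d))
    v_new = v + r1 * (p_best - x) + r2 * (x_s - x)

    # Refresh each particle's personal best at the new positions.
    f = np.apply_along_axis(objective, 1, x_new)
    improved = f < p_best_val
    p_best = p_best.copy()
    p_best_val = p_best_val.copy()
    p_best[improved] = x_new[improved]
    p_best_val[improved] = f[improved]
    return x_new, v_new, p_best, p_best_val
```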

For the optimization of each test function I have chosen a swarm size of 20. For a fair comparison with the GAs, the stopping criterion for PSO is that the average relative difference of the best solution found over the most recent 50 iterations is less than a tolerance, tol=10^{-6}.

Lastly, note that since both the GA and PSO are stochastic optimization routines, the results presented below have each been averaged over 100 simulations.

Numerical Results 
We analyze three quantities:

  1. Accuracy: Average Percent Error = \frac{|f^*-f_{opt}|}{|f_{opt}|}\times 100\%, where f^* is the best objective value found by the algorithm and f_{opt} is the true, known, global optimum value.
  2. Efficiency: Average number of Function Evaluations (FEs) required for convergence.
  3. General Performance Assessment: GPA = FEs \times (\%Err)^2, in which accuracy carries twice the weight of efficiency and smaller GPA values indicate better performance.
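
These three quantities are straightforward to compute once an algorithm has run; as a small sketch (note the percent error as written assumes f_{opt} \neq 0):

```python
def percent_error(f_star, f_opt):
    """Percent error between the best value found (f_star) and the known optimum (f_opt != 0)."""
    return abs(f_star - f_opt) / abs(f_opt) * 100.0

def gpa(function_evals, pct_err):
    """General Performance Assessment: FEs x (%Err)^2; smaller is better."""
    return function_evals * pct_err ** 2
```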

Simulation results for each test function can be found in Table 1. From Table 1 we can draw the following conclusions:

  • In general, MatlabGA outperforms both my own traditional GA and the mGA, with the exception of the 2-D Hump function and the 6-D Hartmann function. This is not surprising given the level of sophistication of Matlab’s built-in optimization algorithms compared to the one-day coding blitz behind my GAs.
  • The mGA is indeed the most efficient of the GAs, as measured by the number of FEs required to establish convergence. Of course, improved efficiency comes at the cost of decreased accuracy; ultimately, the user’s optimization goals will be a major factor in algorithm selection.
  • As expected, PSO outperforms all GAs for objective functions that are strictly convex, with minimal noise and a single global optimum.
  • As Varadi suggests, PSO (or a hybrid PSO method) may be the preferred technique for several portfolio optimization problems.
  • Objective surfaces such as the Schwefel Function and the Rastrigin Function are noisy and multi-modal, containing several local optima. For global optimization of both these functions, MatlabGA proves to outperform PSO.

Table 1

Table 2

For full test function details see:
http://www-optima.amp.i.kyoto-u.ac.jp/member/student/hedar/Hedar_files/TestGO.htm


