Lab 10: Experimenting with Genetic Algorithms

In this lab you will learn how to use Pyro's implementation of
various evolutionary algorithms. You will then perform a series of
experiments to better understand how the parameters of a GA affect its
ability to find solutions. You will also compare a GA to a
simpler hill climbing method. These exercises are adapted from
Melanie Mitchell's *An Introduction to Genetic Algorithms*.

- There is a collection of materials available online to help you
learn how to use Pyro's evolutionary algorithms. Read through the
first half of
PyroModuleEvolutionaryAlgorithms. You can move on to the
experiments below once you have reviewed the section on
*GA Implementation Details*.

**EXPERIMENT 1:** You will run a series of experiments on a single problem, varying the mutation and crossover rates, to see how these variations affect the average time the GA needs to find the optimal string. Use the GAMaxBits.py example, which we discussed in class, as the basis for your experiments. Recall that its fitness function sums the number of ones in the bit string. Update this program in the following ways:

- The string length should be 100.
- The population size should be 100.
- Use an elite percentage of 0.05.
- Use single-point crossover with a rate of 0.7.
- Use a bitwise mutation rate of 0.001.
- Set the maximum generation to be 300.
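The parameter settings above can be pictured with a rough, library-independent sketch. Note that this is plain Python, not Pyro's actual API (Pyro's GA classes and method names differ); every name below is illustrative, and the selection scheme shown (fitness-proportionate roulette wheel) is one common choice, not necessarily the one GAMaxBits.py uses.

```python
import random

LENGTH = 100           # string length
POP_SIZE = 100         # population size
ELITE_PERCENT = 0.05   # elite percentage
CROSSOVER_RATE = 0.7   # single-point crossover rate
MUTATION_RATE = 0.001  # per-bit mutation rate
MAX_GENERATIONS = 300  # maximum generation

def fitness(bits):
    # Bit-sum fitness: count the ones in the string.
    return sum(bits)

def crossover(a, b):
    # Single-point crossover, applied with probability CROSSOVER_RATE.
    if random.random() < CROSSOVER_RATE:
        point = random.randrange(1, LENGTH)
        return a[:point] + b[point:]
    return a[:]

def mutate(bits):
    # Bitwise mutation: flip each bit independently with MUTATION_RATE.
    return [1 - b if random.random() < MUTATION_RATE else b for b in bits]

def select(pop, fits, total):
    # Fitness-proportionate (roulette-wheel) selection.
    r = random.uniform(0, total)
    acc = 0.0
    for p, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return p
    return pop[-1]

def run():
    pop = [[random.randint(0, 1) for _ in range(LENGTH)]
           for _ in range(POP_SIZE)]
    elite_n = int(ELITE_PERCENT * POP_SIZE)   # 5 elites carried over
    for gen in range(MAX_GENERATIONS):
        fits = [fitness(p) for p in pop]
        order = sorted(range(POP_SIZE), key=lambda i: fits[i], reverse=True)
        pop = [pop[i] for i in order]
        fits = [fits[i] for i in order]
        if fits[0] == LENGTH:                 # optimal all-ones string found
            return gen, pop[0]
        total = sum(fits)
        next_pop = [p[:] for p in pop[:elite_n]]          # elitism
        while len(next_pop) < POP_SIZE:
            parent1 = select(pop, fits, total)
            parent2 = select(pop, fits, total)
            next_pop.append(mutate(crossover(parent1, parent2)))
        pop = next_pop
    fits = [fitness(p) for p in pop]
    return MAX_GENERATIONS, pop[max(range(POP_SIZE), key=lambda i: fits[i])]

gen, best = run()
print(gen, fitness(best))
```

In the actual lab you would set these same six parameters on Pyro's GA object rather than writing the loop yourself; the sketch is only meant to make the role of each parameter concrete.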

**EXPERIMENT 2:** You will run a series of experiments on a problem, keeping the mutation rate and crossover rate fixed while varying the population size. Copy the program from the previous experiment and update it to use a fitness function that interprets the bit string as a binary number. Change the `isDone` method to stop when the largest possible binary number has been found. Then set the remaining parameters as follows:

- The string length should be 30.
- Start with a population size of 100.
- Use an elite percentage of 0.05.
- Use single-point crossover with a rate of 0.7.
- Use bitwise mutation with a rate of 0.001.
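The two changes for this experiment can be sketched as plain Python functions. In Pyro, `isDone` is a method you override on the GA subclass; here it is shown as a standalone function purely for illustration.

```python
LENGTH = 30  # string length for Experiment 2

def fitness(bits):
    # Interpret the bit string as an unsigned binary number,
    # most significant bit first.
    value = 0
    for b in bits:
        value = value * 2 + b
    return value

def isDone(best_bits):
    # Stop once the largest possible 30-bit binary number
    # (the all-ones string) has been found.
    return fitness(best_bits) == 2 ** LENGTH - 1

print(fitness([1] * LENGTH))  # 1073741823, the largest 30-bit value
print(isDone([1] * LENGTH))   # True
```

Because the high-order bits dominate the fitness value, this function rewards the GA very unevenly across positions, which is part of what makes the convergence behavior interesting to compare across population sizes.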

View the resulting plots with:

`xgraph GAAvgFitness GABestFitness`

How do these plots change as you vary the population size? What can you conclude about how population size affects the speed of convergence of a GA?

**EXPERIMENT 3:** Compare the GA's performance on the bit-sum fitness function with that of the hill climbing algorithm given below:

1. Start with a single randomly generated string. Calculate its fitness.
2. Randomly mutate one bit of the current string.
3. If the fitness of the mutated string is equal to or higher than the fitness of the original string, keep the mutated string. Otherwise keep the original string.
4. Go to step 2.

Iterate this algorithm for 10,000 steps. This equals the number of fitness function evaluations performed by a GA with a population size of 100 that is run for 100 generations. Plot the best fitness found so far at every 100 evaluation steps (equivalent to one GA generation), averaged over 10 runs. Compare this with a plot of the GA's best fitness found so far as a function of generation. Which algorithm finds better-performing strings? Which algorithm finds them faster? Comparisons like these are important if claims are to be made that a GA is a more effective search algorithm than other stochastic methods **on a given problem**.
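The hill climbing steps above can be sketched directly in Python. This is a minimal illustration (all names are ours, not Pyro's); it performs one run and records the best fitness every 100 evaluations, leaving the averaging over 10 runs to you.

```python
import random

LENGTH = 100     # bit-string length, as in Experiment 1
STEPS = 10_000   # = 100 generations x population size 100 evaluations

def fitness(bits):
    # Bit-sum fitness: count the ones in the string.
    return sum(bits)

def hill_climb(steps=STEPS):
    # Step 1: start with a single randomly generated string.
    current = [random.randint(0, 1) for _ in range(LENGTH)]
    best_curve = []
    for step in range(1, steps + 1):
        # Step 2: randomly mutate one bit of the current string.
        candidate = current[:]
        i = random.randrange(LENGTH)
        candidate[i] = 1 - candidate[i]
        # Step 3: keep the mutant if it is at least as fit;
        # otherwise keep the original. (Step 4: loop back.)
        if fitness(candidate) >= fitness(current):
            current = candidate
        # Record the best fitness every 100 evaluations,
        # i.e. once per "GA generation" worth of work.
        if step % 100 == 0:
            best_curve.append(fitness(current))
    return best_curve

curve = hill_climb()
print(len(curve), curve[-1])  # 100 recorded points
```

Because a single-bit flip on the bit-sum function is accepted only when it turns a 0 into a 1, the recorded curve is monotonically non-decreasing, which makes the comparison with the GA's best-fitness-per-generation plot straightforward.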