Lab 2: Traveling Salesperson
Due Feb. 8 by midnight

Problem: Given latitude and longitude for a set of cities, find the shortest round-trip tour that visits every city.

Starting point code

You are encouraged to work with a partner on this lab. Please read the following instructions carefully.

  1. First you need to run setup63 to create a git repository for the lab.

    If you want to work alone, run:

    setup63-Lisa labs/02 none

    If you want to work with a partner, then one of you needs to run the following command while the other waits until it finishes:

    setup63-Lisa labs/02 partnerUsername

    Once the script finishes, the other partner should run it on their account.

  2. For the next step, only one partner should copy over the starting point code:
    cd ~/cs63/labs/02
    cp -r /home/bryce/public/cs63/labs/02/* .

  3. Whether you are working alone or with a partner, you should now add, commit, and push these files as shown below:
    git add *
    git commit -m "lab2 start"
    git push

  4. If you are working with a partner, your partner can now pull the changes in:
    cd ~/cs63/labs/02
    git pull
You are now ready to start the lab.


The objectives of this lab are to:

  - implement several local search algorithms for the traveling salesperson problem
  - compare their performance across a range of parameter settings

You will implement the following local search methods:

  - hill climbing
  - simulated annealing
  - stochastic beam search

You will need to modify the following python files:

Traveling Salesperson Problem

A traveling salesperson tour visits each of a collection of cities once before returning to the starting point. The objective is to minimize the total distance traveled. We will represent a tour by listing cities in the order visited.

The class TSP in the file represents a problem instance and provides several useful methods. TSP.random_tour() returns a random state from which we can begin a local search. TSP.cost(tour) evaluates the total distance traveled to complete a tour. TSP.get_neighbor(tour) calls functions you will write to return a new tour in which two cities have swapped positions.

A TSP instance is constructed by passing in the name of a JSON file specifying the latitude and longitude of locations that must be visited. Example JSON files are located in the coordinates/ folder. The __init__() method also takes a string specifying how to select a neighboring state: passing "best" or "random" will determine what gets called by the TSP.get_neighbor() method.

The TSP class leaves three methods for you to implement:

Neighbors of a tour are produced by swapping the positions of any two cities in the ordered list.

You are strongly encouraged to write a main function in order to test your implementation of the neighbor functions.
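As a sketch of what the neighbor functions might look like (the helper names swap, random_neighbor, and all_neighbors are illustrative, not the assignment's required API):

```python
import random

# Illustrative neighbor helpers: a neighbor is any tour obtained by
# exchanging the cities at two distinct positions.

def swap(tour, i, j):
    """Return a copy of tour with the cities at positions i and j exchanged."""
    neighbor = list(tour)
    neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
    return neighbor

def random_neighbor(tour):
    """Swap two distinct random positions."""
    i, j = random.sample(range(len(tour)), 2)
    return swap(tour, i, j)

def all_neighbors(tour):
    """Every tour reachable from this one by a single swap."""
    n = len(tour)
    return [swap(tour, i, j) for i in range(n) for j in range(i + 1, n)]
```

A main function can exercise these on a small hand-made tour, where the full neighbor set (n choose 2 tours) is easy to check by eye.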

Hill Climbing

The program is already set up to read in a file with latitude and longitude for various cities, as well as a configuration file. The main function initializes a TSP object, calls hill_climbing(), and then outputs the resulting tour, its cost, and a plot (map.pdf by default). Your task is to implement hill_climbing(). To get started, replace the NotImplementedError with the following lines:
tour = tsp.random_tour()
cost = tsp.cost(tour)
return tour, cost
This will allow you to run without generating errors:
./ coordinates/United_States_25.json
Doing so will print the random tour and its cost, and will also create a file that plots the tour in latitude/longitude coordinates. By default, the plot is saved in the file map.pdf, but you can change this with the -plot command line argument:
./ coordinates/India_15.json -plot random_tour.pdf
Your job is to implement the following hill climbing algorithm:
state = random tour
best = state
while evals < MAX_EVALS:
    if random restart:
        state = random tour
    else if random move:
        state = random neighbor of state
    else:
        neighbor = generate (best or random) neighbor of state
        if cost(neighbor) < cost(state):
            state = neighbor
    if cost(state) < cost(best):
        best = state
end while
return best, cost(best)
Note that because the TSP class counts the number of times cost() gets called, you should save the value rather than calling cost() again when possible. Several of the parameters used above are specified in a config file. By default, the program reads HC_defaults.json. You should copy this file and modify it, passing in the new file via the -config command line argument:
cp HC_defaults.json HC_config.json
*edit HC_config.json*
./ coordinates/South_Africa_10.json -config HC_config.json
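Putting the pseudocode together, a hill-climbing implementation might look like the sketch below. The config keys RESTART_PROB and MOVE_PROB are hypothetical names; use whatever your config file actually calls them.

```python
import random

# A sketch of the hill-climbing loop, assuming the TSP interface described
# above (random_tour, cost, get_neighbor).  RESTART_PROB and MOVE_PROB are
# hypothetical config key names.  Costs are saved in variables so cost()
# is called only once per tour, since the TSP class counts those calls.

def hill_climbing(tsp, config):
    state = tsp.random_tour()
    state_cost = tsp.cost(state)
    best, best_cost = state, state_cost
    for _ in range(config["MAX_EVALS"]):
        roll = random.random()
        if roll < config["RESTART_PROB"]:
            # random restart: jump to a fresh random tour
            state = tsp.random_tour()
            state_cost = tsp.cost(state)
        elif roll < config["RESTART_PROB"] + config["MOVE_PROB"]:
            # random move: accept a neighbor unconditionally (if your TSP
            # class distinguishes random from best neighbors, this branch
            # should use a truly random one)
            state = tsp.get_neighbor(state)
            state_cost = tsp.cost(state)
        else:
            # greedy step: keep the neighbor only if it improves the cost
            neighbor = tsp.get_neighbor(state)
            neighbor_cost = tsp.cost(neighbor)
            if neighbor_cost < state_cost:
                state, state_cost = neighbor, neighbor_cost
        if state_cost < best_cost:
            best, best_cost = state, state_cost
    return best, best_cost
```

Note that this sketch counts loop iterations rather than cost() calls; adjust the termination test to match however your MAX_EVALS budget is actually defined.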

Simulated Annealing

The program is set up much like the hill climbing program; the main function and input/output formats should be familiar. Your task is to implement the algorithm for simulated annealing described in Poole and Mackworth.

The config file for simulated annealing includes a slightly different set of parameters. In particular, config["INIT_TEMP"] specifies the starting temperature, and config["TEMP_DECAY"] gives the geometric rate at which the temperature decays. The RUNS and STEPS parameters specify how many random starting points simulated annealing should be run from, and how long each run should last.
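Under these assumptions, a simulated annealing implementation could look like the following sketch. It uses the standard Metropolis acceptance rule, so double-check it against the version in Poole and Mackworth before relying on it.

```python
import math
import random

# A sketch of simulated annealing with geometric cooling, using the config
# keys described above (RUNS, STEPS, INIT_TEMP, TEMP_DECAY).

def simulated_annealing(tsp, config):
    best, best_cost = None, float("inf")
    for _ in range(config["RUNS"]):
        state = tsp.random_tour()        # each run starts from a fresh tour
        state_cost = tsp.cost(state)
        temp = config["INIT_TEMP"]       # ...and a fresh temperature
        for _ in range(config["STEPS"]):
            neighbor = tsp.get_neighbor(state)
            neighbor_cost = tsp.cost(neighbor)
            delta = neighbor_cost - state_cost
            # always accept improvements; accept worse tours with
            # probability e^(-delta / temp), which shrinks as temp decays
            if delta < 0 or random.random() < math.exp(-delta / temp):
                state, state_cost = neighbor, neighbor_cost
            if state_cost < best_cost:
                best, best_cost = state, state_cost
            temp *= config["TEMP_DECAY"]  # geometric decay
    return best, best_cost
```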

Beam Search

Your final implementation task is to write stochastic beam search as outlined in section 4.9 of Poole and Mackworth. Beam search uses the same temperature and steps parameters as simulated annealing. However, note that the algorithm uses the temperature slightly differently. Additionally, instead of starting several separate runs, beam search maintains a population of candidates at each step; the POPULATION parameter specifies the number of candidates.

Stochastic beam search considers all neighbors of all individual candidates in the population and preserves each with a probability determined by its cost and the current temperature (lower-cost tours should be more likely to survive). It is recommended that you add a helper function for beam search that evaluates the cost of each neighbor and determines its probability of selection based on the current temperature, then randomly selects the neighbors that will appear in the next iteration. For the larger maps, the number of neighbors grows quite large, so it may be impractical to evaluate the cost of every neighbor. Your helper function should therefore consider only a random subset of each individual's neighbors. The size of this subset is given by the MAX_NEIGHBORS parameter in the beam search config file.

A very useful python function for stochastic beam search is numpy.random.multinomial. This function takes a number of samples and a list representing a discrete probability distribution. It returns a list giving the number of times each element was drawn across those random samples. For example:

>>> from numpy.random import multinomial
>>> for i in range(5):
...     print(multinomial(10, [.4, .2, .1, .3]))
[6 2 0 2]
[6 1 0 3]
[5 2 1 2]
[4 1 1 4]
[5 2 2 1]
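One way to use multinomial in the selection step is sketched below. select_population is a hypothetical helper (not part of the starter code); it weights candidates by e^(-cost/temp), under the assumption that lower-cost tours should be more likely to survive and that the favoritism sharpens as the temperature drops.

```python
import math
from numpy.random import multinomial

# Hypothetical selection helper for stochastic beam search: each candidate
# survives with probability proportional to e^(-cost / temp).

def select_population(candidates, costs, temp, pop_size):
    weights = [math.exp(-c / temp) for c in costs]
    total = sum(weights)
    probs = [w / total for w in weights]      # normalize to a distribution
    # multinomial reports how many copies of each candidate were drawn
    counts = multinomial(pop_size, probs)
    survivors = []
    for candidate, count in zip(candidates, counts):
        survivors.extend([candidate] * count)
    return survivors
```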

Exploring the Space of Parameters

You should test each algorithm for correctness as you implement it, but when you have completed all three, you can compare performance. For each algorithm (and for both hill climbing neighbor selection methods), you should try out a number of parameter settings and track the best results as well as the parameters that found them. In the file best_params.txt, you should record the results of this exploration. You should pick one problem to test all algorithms on, and also try all problems in the coordinates/ directory with at least one algorithm.

You are encouraged to try automating this exploration process. Parameter tuning for new algorithms is in fact one of the most common uses of local search! However, if you plan to run long jobs on lab machines please be courteous to other users. Monitor your memory usage with the program htop, and make sure that python isn't hogging all of a lab machine's memory unless you are sitting at that machine. When running long jobs on a shared machine, it also helps to use the nice command to give other users priority, but nice does not absolve you from the need to monitor your memory footprint.
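If you do automate the sweep, a minimal grid search might look like this sketch. Here run_trial stands for whatever function runs one of your algorithms with a given config and returns a (tour, cost) pair; the grid maps each config key to the list of values you want to try (the structure is illustrative, not a recommendation of which values to test).

```python
import itertools

# A sketch of automating the parameter sweep: try every combination of
# values in the grid and remember the settings that found the best cost.

def grid_search(run_trial, grid):
    best_params, best_cost = None, float("inf")
    keys = list(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        config = dict(zip(keys, values))
        _, cost = run_trial(config)
        if cost < best_cost:
            best_params, best_cost = config, cost
    return best_params, best_cost
```

Because the search algorithms are stochastic, you may want run_trial to average over several runs per config before recording results in best_params.txt.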

Submitting your code

Before submitting, ensure that you have added your name(s) to the top-level comments. To submit your code, you need to use git to add, commit, and push the files that you modified. Only,,,, and best_params.txt will be graded; other files added to your repository will be ignored.
cd ~/cs63/labs/02
git add best_params.txt
git commit
git push