Lab 7: Subsumption architecture
Due April 16

Genghis: one of the first robots controlled by a subsumption architecture.

Starting point code

This lab may be done alone or with a partner of your choice. Go through the following steps to set up your directory for this lab.

  1. First you need to run setup63 to create a git repository for the lab. If you want to work alone do:
    setup63 labs/07 none
    If you want to work with a partner, then one of you needs to run the following while the other one waits until it finishes.
    setup63 labs/07 partnerUsername
    Once the script finishes, the other partner should run it on their account.

  2. For the next step only one partner should copy over the starting point code.
    cd ~/cs63/labs/07
    cp -r ~meeden/public/cs63/labs/07/* ./
    This will copy over the starting point files for this lab.

  3. Whether you are working alone or with a partner, you should now add all of the files to your git repo, commit, and push them as shown below.
    git add *
    git commit -m "lab7 start"
    git push

  4. If you are working with a partner, your partner can now pull the changes in.
    cd ~/cs63/labs/07
    git pull

Using pyrobot

Begin by familiarizing yourself with pyrobot, a Python library for experimenting with both physical and simulated robots. Go through the pyrobot overview. Invoke pyrobot and experiment with it from the command line. (You can skip the sections on the camera, gripper, and simulation devices, since you won't be using these for this lab.)

Once you have familiarized yourself with the interface via the command line, open and edit the provided example brain file. Experiment with using this brain to control the robot. Modify the file in various ways, reload the brain in the robot, and retest it.
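To give a sense of what a pyrobot-style brain looks like, here is a minimal sketch. The `Brain` base class, the `front_distance` sensor attribute, and `FakeRobot` are stand-ins invented for illustration so the snippet runs on its own; in the lab environment you would subclass pyrobot's actual brain class and read its real sensors instead.

```python
# Stub standing in for pyrobot's brain base class so this sketch is
# self-contained; in the lab you would subclass the real one instead.
class Brain:
    def __init__(self, robot):
        self.robot = robot

class ForwardBrain(Brain):
    """Drive forward, turning in place when something is directly ahead."""
    def step(self):
        # front_distance is a hypothetical sensor reading for illustration.
        if self.robot.front_distance < 0.5:
            self.robot.move(0.0, 0.2)   # stop translating, rotate slowly
        else:
            self.robot.move(0.5, 0.0)   # cruise forward

class FakeRobot:
    """Tiny stand-in so the sketch runs without a simulator."""
    def __init__(self, front_distance):
        self.front_distance = front_distance
        self.last_command = None
    def move(self, translate, rotate):
        # Record the (translate, rotate) motor command instead of moving.
        self.last_command = (translate, rotate)

robot = FakeRobot(front_distance=2.0)
ForwardBrain(robot).step()
print(robot.last_command)  # (0.5, 0.0)
```

The key idea carries over directly: a brain's step method is called once per time step, reads sensors, and issues one motor command.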

Building a subsumption brain

You will write a single subsumption-style brain that can find a light in the series of increasingly difficult worlds shown below. These worlds are named:

The file contains an example of how to implement a form of the subsumption architecture. It creates a set of layered behaviors which are then added to a brain in priority order (from lowest to highest). For each time step, the highest priority behavior that is triggered will control the robot. To test this file on the LightBehindWall world do:

pyrobot -s PyrobotSimulator -w -r -b
Be sure you understand how this example works before moving on to creating your own subsumption brain. An illustration of the subsumption framework, taken from one of Rodney Brooks' early papers, is shown below.
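The coordination scheme described above can be sketched in plain Python, independent of pyrobot. The class names, sensor dictionary, and motor-command tuples below are illustrative assumptions, not the lab's actual code; the point is the priority rule: behaviors are ordered lowest to highest, and each time step the highest-priority behavior whose trigger fires controls the robot.

```python
import random

class Behavior:
    """A behavior fires when check() is True; act() returns a motor command."""
    def check(self, sensors):
        raise NotImplementedError
    def act(self, sensors):
        raise NotImplementedError

class Wander(Behavior):
    """Lowest priority: drift forward with a small random turn."""
    def check(self, sensors):
        return True  # always applicable, so it acts as the fallback
    def act(self, sensors):
        return (0.5, random.uniform(-0.3, 0.3))

class Avoid(Behavior):
    """Higher priority: rotate away when an obstacle is close in front."""
    def check(self, sensors):
        return sensors["front"] < 0.5
    def act(self, sensors):
        return (0.0, 1.0)

def subsume(behaviors, sensors):
    """Run the highest-priority triggered behavior.
    behaviors are listed lowest to highest priority, matching the lab's
    convention of adding layers to the brain from lowest to highest."""
    for b in reversed(behaviors):
        if b.check(sensors):
            return b.act(sensors)
    return (0.0, 0.0)  # nothing triggered: stay still

brain = [Wander(), Avoid()]            # lowest to highest priority
print(subsume(brain, {"front": 0.2}))  # Avoid suppresses Wander: (0.0, 1.0)
```

With nothing nearby, Avoid's trigger is False and control falls through to Wander, the lowest layer.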

You will create your own subsumption-style controller in the provided file. You are welcome to use or modify the behaviors provided in the example file. In order to solve this problem, you will need to add behaviors that push the robot to explore the world more effectively. Some options you may want to consider are:

Be sure to comment each of your behaviors. Your subsumption program should be able to find the light in all of the worlds depicted above, even if it takes a while.
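As one example of the kind of exploration behavior you might add, here is a hedged sketch of a stuck-detector: it watches recent positions and fires when the robot has barely moved, then backs up while turning. The class name, window size, and thresholds are all illustrative guesses, and a real version would plug into your behavior framework's trigger/act methods.

```python
from collections import deque

class StuckEscape:
    """Fires when recent positions barely change, suggesting the robot is
    wedged against a wall; responds by backing up while turning.
    Window size and travel threshold are illustrative, not tuned values."""
    def __init__(self, window=10, min_travel=0.1):
        self.history = deque(maxlen=window)   # sliding window of positions
        self.min_travel = min_travel
    def check(self, position):
        self.history.append(position)
        if len(self.history) < self.history.maxlen:
            return False                       # not enough data yet
        (x0, y0), (x1, y1) = self.history[0], self.history[-1]
        # Triggered if net displacement over the window is tiny.
        return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 < self.min_travel
    def act(self):
        return (-0.3, 0.8)                     # back up while turning

escape = StuckEscape(window=3)
for pos in [(0.0, 0.0), (0.01, 0.0), (0.02, 0.0)]:
    stuck = escape.check(pos)
print(stuck)  # True: the robot moved only 0.02 over the window
```

Comparing net displacement (rather than per-step motion) also catches the robot oscillating in place, a common failure mode in the walled worlds.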

More challenging worlds

There are four additional worlds called Challenge1-Challenge4, on which you can test your subsumption controller. If your controller can consistently solve these, well done! I will demo some of the best student controllers in class.

Submitting your code

To submit your code, you need to use git to add, commit, and push the one file that you modified: