Cogs1 Spring 2007
Lab 2: Creating a brain for a simulated robot
Due by 11:30pm on Wednesday, February 14


Introduction

We will use simulated robots throughout the semester to better understand the embodied approach to cognitive science. Our software tool is called pyrobot, which stands for Python robotics. The goal of this tool is to provide a programming environment for easily exploring ideas in cognitive science and artificial intelligence without having to worry about the low-level details of the underlying hardware of particular robots.

This week we will learn how to create robot controllers that have a close coupling between perception and action. This means that the controller keeps little or no internal state and does minimal computation: each sensor reading maps more or less directly to a motor command. This style is called direct or reactive control.
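The idea of a direct perception-to-action mapping can be sketched in a few lines. This is an illustration only, not pyrobot code; the percept value and the 0.5 threshold are made-up assumptions:

```python
# Minimal sketch of reactive control: each step maps the current
# percept directly to an action, with no memory between steps.

def reactive_step(percept):
    """Direct perception-to-action mapping (no stored state)."""
    # percept: an assumed normalized distance reading (illustrative)
    return "turn" if percept < 0.5 else "forward"
```

Because nothing is remembered between calls, the robot's behavior at each moment depends only on what it senses at that moment.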


Effectors

An effector is a part of a robot that can change the state of the world. For example, motors attached to wheels can cause a robot to move, or a gripper can pick things up and move them. The basic effectors assumed in our experiments will be motors. We will also assume that each robot is capable of only two kinds of motion:

  1. translation: moving forward or backward along its current heading
  2. rotation: turning left or right in place

It is also possible to blend these movements so that a robot can move forward or backward while turning. The command to make a robot move is:
self.robot.move(translation, rotation)
where translation is a floating point number between -1 and 1, and rotation is also a floating point number between -1 and 1. Negative translation values correspond to backward motion; positive translation values correspond to forward motion. Negative rotation values correspond to right turns; positive rotation values correspond to left turns. Here are some example movements:
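Some example movements can be sketched as follows. This is an illustration, not pyrobot itself: the stand-in move function below just clamps its arguments to the legal range [-1, 1] and returns the command pair, so the sign conventions can be checked outside a robot brain:

```python
# Stand-in for self.robot.move(translation, rotation), for illustration.

def clamp(x, lo=-1.0, hi=1.0):
    """Keep a motor command within the legal range [-1, 1]."""
    return max(lo, min(hi, x))

def move(translation, rotation):
    """Return the (translation, rotation) pair that would be sent to the motors."""
    return (clamp(translation), clamp(rotation))

# Example movements:
move(1.0, 0.0)     # full speed forward
move(-0.5, 0.0)    # half speed backward
move(0.0, 1.0)     # turn left in place
move(0.0, -1.0)    # turn right in place
move(0.3, -0.3)    # blend: arc forward while turning right
move(0.0, 0.0)     # stop
```

Inside an actual brain, the same argument pairs would be passed to self.robot.move.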


Sensors

Each robot comes equipped with a collection of sensors. The simulated robots we'll be using may have sonar sensors, light sensors, stall sensors, and cameras. Sonar sensors are active range finders that can be used to sense distances to obstacles. Light sensors report the amount of visible light. Stall sensors indicate whether the robot is unable to move. Cameras provide two-dimensional images.

This week we'll focus on learning how to use the sonar sensors. The robot comes equipped with a ring of 16 sonar sensors, numbered from 0 to 15. Sonar number 0 is positioned at the front of the left side of the robot, and the remaining sonars are numbered clockwise from there. The command to view the current value of a particular sonar sensor is:

self.robot.range[number].distance()
where number is between 0 and 15. This command returns a value measured in what are known as robot units, where one robot unit equals the diameter of the robot. For example, if the left side of the robot were very close to a wall, executing this command for sonar number 0 or 1 would return a small value, less than 0.5, meaning that the robot is less than half of its diameter from the wall.
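To see how such readings can drive a reactive behavior, here is a hedged sketch of a simple avoidance rule. The choice of "front-left" and "front-right" readings and the 0.5 threshold are assumptions for illustration; a real brain would obtain the distances from self.robot.range[number].distance() inside its step method:

```python
# Illustrative reactive rule: map two front sonar distances (in robot
# units) directly to a (translation, rotation) command. Threshold and
# motion magnitudes are arbitrary choices for the sketch.

def avoid_step(front_left, front_right, threshold=0.5):
    """Return a (translation, rotation) pair from two sonar distances."""
    if front_left < threshold and front_right < threshold:
        return (-0.3, 0.0)   # blocked on both sides: back up
    elif front_left < threshold:
        return (0.0, -0.3)   # obstacle on the left: turn right
    elif front_right < threshold:
        return (0.0, 0.3)    # obstacle on the right: turn left
    else:
        return (0.3, 0.0)    # path clear: move forward
```

Note that the rule uses the sign conventions from the Effectors section: negative rotation turns right, positive turns left.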


Getting Started


What to turn in for this lab

  1. In your cogs1/labs/2 directory there are several example brains. Experiment with the brains avoid.py and wallFollow.py and answer the questions posed in the file called explanations. You can edit this file in IDLE.
  2. Use handin-cogs1 to turn in your answers.