### Cogs1 Spring 2007 Lab 2: Creating a brain for a simulated robot Due by 11:30pm on Wednesday, February 14

#### Introduction

We will be using simulated robots throughout the semester to better understand the embodied approach to cognitive science. We will be using a software tool called pyrobot, which stands for python robotics. The goal of this tool is to provide a programming environment for easily exploring ideas in cognitive science and artificial intelligence without having to worry about the low-level details of the underlying hardware of particular robots.

This week we will learn how to create robot controllers that tightly couple perception to action, using little internal state and minimal computation. This style of control is called direct or reactive control.

#### Effectors

An effector is a part of a robot that can change the state of the world. For example, motors attached to wheels can cause a robot to move, or a gripper can pick things up and move them. The basic effectors assumed in our experiments will be motors. We will also assume that each robot is capable of only two kinds of motion:

• Translation: Going forward or backward
• Rotation: Spinning clockwise or counter-clockwise

It is also possible to blend these movements so that a robot can move forward or backward while turning. The command to make a robot move is:
```
self.robot.move(translation, rotation)
```
where translation and rotation are each floating point numbers between -1 and 1. Negative translation values move the robot backward; positive values move it forward. Negative rotation values turn the robot right; positive values turn it left. Here are some example movements:

• self.robot.move(1,0), forward
• self.robot.move(-1,0), backward
• self.robot.move(0,0), stop
• self.robot.move(0,-1), right turn in place
• self.robot.move(0.5, 0.5), sweeping left turn
• self.robot.move(0.1, -0.3), slow right turn
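
Because both arguments to move must stay within [-1, 1], a brain that computes its motor commands may want to clamp them first. Here is a minimal sketch in plain Python; the helper name clamp is illustrative and is not part of pyrobot:

```python
def clamp(value, low=-1.0, high=1.0):
    """Keep a computed motor command within the legal [-1, 1] range."""
    return max(low, min(high, value))

# An over-large turn command gets limited to a legal value:
safe_rotation = clamp(-2.5)    # -> -1.0 (hard right turn)
safe_translation = clamp(0.3)  # -> 0.3 (unchanged)
```

A brain could then safely call self.robot.move(clamp(t), clamp(r)) no matter how t and r were computed.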

#### Sensors

Each robot comes equipped with a collection of sensors. The simulated robots we'll be using may have sonar sensors, light sensors, stall sensors, and cameras. Sonar sensors are active range finders that can be used to sense distances to obstacles. Light sensors report the amount of visible light. Stall sensors indicate whether the robot is unable to move. Cameras provide two-dimensional images.

This week we'll focus on learning how to use the sonar sensors. The robot comes equipped with a ring of 16 sonar sensors, numbered from 0 to 15. Sonar number 0 is positioned at the front of the left side of the robot. The remaining sonars are counted in order from here in a clockwise direction. The command to view the current value of a particular sonar sensor is:

```
self.robot.range[number].distance()
```
where number is between 0 and 15. This command returns a value measured in what are known as robot units, where one robot unit equals the diameter of the robot. For example, if the left side of the robot were very close to a wall, then executing this command for sonar number 0 or 1 would return a very small value, less than 0.5, meaning that the robot is less than half of its diameter from the wall.
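
Sonar readings like these are the raw material of a reactive brain: each reading maps directly to a movement choice. The sketch below shows only that decision logic as plain Python; the function name, the 0.5 threshold, and the idea of passing in two "front" readings are illustrative assumptions, not part of pyrobot:

```python
def pick_move(front_left, front_right, threshold=0.5):
    """Map two front sonar readings (in robot units) to a (translate, rotate) pair."""
    if front_left < threshold and front_right < threshold:
        return (-0.5, 0.0)   # blocked on both sides: back up
    elif front_left < threshold:
        return (0.0, -0.5)   # obstacle on the left: turn right
    elif front_right < threshold:
        return (0.0, 0.5)    # obstacle on the right: turn left
    return (1.0, 0.0)        # path is clear: go straight ahead

pick_move(2.0, 2.0)  # -> (1.0, 0.0)
pick_move(0.2, 1.5)  # -> (0.0, -0.5)
```

In an actual brain, the readings would come from calls like self.robot.range[0].distance(), and the returned pair would be handed to self.robot.move.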

#### Getting Started

• To get copies of the lab 2 files, in the terminal window type: update-cogs1
• Then to begin simulating robots, in the terminal window type: pyrobot and then do the following:

1. In the pyrobot window, press the button labeled Server
2. Select PyrobotSimulator and press OK
3. Scroll down through the various worlds and select Tutorial.py and press OK. This should open a new window containing a red robot in a simple world.
4. In the pyrobot window, select Robot
5. Select PyrobotRobot60000.py and press OK. This should connect to the red robot in the other window, which will now have lines coming out of it representing its sonar sensors.
6. In the pyrobot window, select Brain
7. Scroll down through the various brains and select Joystick.py and press OK.
8. In the pyrobot window, select Run. The brain is now active and ready to control the robot's movements. For the joystick-based brain, use the mouse to drag from the dot in the center towards a particular movement.
9. In the pyrobot window, press Stop to deactivate the chosen brain.
10. Let's try selecting one of the brains that I provided. In the pyrobot window, select Brain. Then press the Home button in the upper-right corner and choose the appropriate directories to get to the lab 2 files. Then select sensing.py. Select Run to see this brain in action.

• To end a session of simulating robots, go to the File menu and select Exit. This should close all of the windows associated with pyrobot. If some windows remain, open another terminal and type: endpyrobot. This will force all of the windows to close.
• You can see the actual program associated with a brain by clicking on the brain's name in the pyrobot window. This will open an editor window. If the brain is from your own directory, then you are free to modify it. Try adding print statements, such as:
```
print "my message"
```
Save the change by choosing File and then Save. Then in the pyrobot window, select Reload brain. Now when you press Run the new version of the brain will execute and you should see information printing in the pyrobot window.
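
A brain file is ordinary Python: a class whose step method pyrobot calls over and over while the brain is running, once per perception-action cycle. The sketch below mimics that structure with a stand-in Brain base class and a scripted sonar reading so it can run outside the simulator; the class name and constructor are illustrative, and only the idea of a repeatedly-called step method comes from pyrobot:

```python
class Brain(object):
    """Stand-in for pyrobot's Brain base class so this sketch runs anywhere."""
    pass

class TurnWhenBlocked(Brain):
    def __init__(self, front_reading):
        # In pyrobot, readings would come from self.robot.range[...].distance();
        # here we fake a single front sonar value for illustration.
        self.front_reading = front_reading

    def step(self):
        """One perception-action cycle: read the sonar, choose a move."""
        if self.front_reading < 0.5:
            return (0.0, -0.5)   # too close: turn right in place
        return (1.0, 0.0)        # clear: full speed ahead

brain = TurnWhenBlocked(front_reading=0.3)
brain.step()  # -> (0.0, -0.5)
```

In a real brain, step would end by calling self.robot.move with the chosen pair instead of returning it.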

#### What to turn in for this lab

1. In your cogs1/labs/2 directory there are several example brains. Experiment with the brains avoid.py and wallFollow.py and answer the questions posed in the file called explanations. You can edit this file in idle.