The robot will be placed no farther than 24" from an omnidirectional light source (a lightbulb roughly on the same plane as the robot) and oriented within 60 degrees of the optimal path towards the source. The robot has 2 minutes to drive autonomously towards the light source and stop within a 6" radius zone around the source (the "home zone"). Stopping is important! The robot is considered within the home zone if any part of it is within the zone (it does not have to be entirely inside).
Constraints: vehicle must fit within a 6"x6" footprint. No height restrictions.
Grading points: (best of 3 trials)
Robot built - 10
Orients towards light - +10
Moves towards light - +10
Stops within 18" radius - +5
Stops within 12" radius - +5
Stops within 6" radius - +10
Doesn't fit into 6"x6" footprint - -5
There are several ways to interpret the output of a light sensor. One is to make a direct proportional connection between the sensor value and the motor output power. Another is to set a sensor value threshold above which a motor response is triggered. Yet another is to trigger a response when the sensor value falls below a threshold. Experiment to see which works best.
In this part of the lab, we are interested in very simple controllers of the kind imagined by Valentino Braitenberg (see Readings). So make your program as simple as possible while still doing the specified job. In particular, you may only use the OnFwd() and OnRev() motor control commands.
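As a starting point, a threshold-style controller in this minimal spirit might look like the NQC sketch below. The sensor ports, the left/right assignment of outputs A and C, and the threshold value are all assumptions about your build; calibrate the threshold against your own sensors before trusting it.

```c
// NQC sketch (not a finished solution): drive toward the light using only
// OnFwd() and OnRev(). Assumes light sensors on ports 1 (left) and 3 (right),
// left motor on OUT_A, right motor on OUT_C. THRESHOLD is a placeholder.
#define THRESHOLD 55

task main()
{
    SetSensor(SENSOR_1, SENSOR_LIGHT);   // left sensor
    SetSensor(SENSOR_3, SENSOR_LIGHT);   // right sensor

    while (true)
    {
        if (SENSOR_1 > THRESHOLD && SENSOR_3 > THRESHOLD)
        {
            OnFwd(OUT_A + OUT_C);        // light ahead: drive forward
        }
        else if (SENSOR_1 > THRESHOLD)
        {
            OnRev(OUT_A); OnFwd(OUT_C);  // light on the left: pivot left
        }
        else
        {
            OnFwd(OUT_A); OnRev(OUT_C);  // light on the right (or lost): pivot right
        }
    }
}
```

Note this sketch never stops; deciding when the robot is "close enough" to halt, within the allowed commands, is part of the assignment.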
Sensors are strange. They do not behave linearly everywhere, and they may need to be reset (power cycle the robot) to work properly again. You will get 3 tries, and you can switch off the light source between them if needed. Experiment with sensor placements for better readings.
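One way to take the guesswork out of threshold choice is to watch the raw reading directly while you move the robot around. A possible NQC sketch, assuming the sensor is on port 1:

```c
// NQC sketch: continuously show the raw light reading on the RCX LCD
// so you can pick a sensible threshold before the trial.
task main()
{
    SetSensor(SENSOR_1, SENSOR_LIGHT);
    SetUserDisplay(SENSOR_1, 0);   // show sensor 1 on the display, 0 decimal places
    while (true) Wait(100);        // keep the program running while you aim the robot
}
```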
First, coordinate with the second team and use both teams' robots, and add some battery-powered lights to your vehicles. We have provided you with 3 LED lights per 2 teams, and you may use these and/or other lights as you wish. Each vehicle must carry at least one source of light. Then, unleash your creative imaginations and design a set of two Braitenberg-style controllers (one for each robot) that together seem to an observer to be engaged in some "life-like" behavior describable in intentional terms. Intentional terms are those giving agency or intent to the vehicles, for example: like, dislike, be aggressive, be shy, pursue, attack, etc.
During the demo, the other 4-person team and the instructor will be guessing what intentional terms you were designing for. Telling the other 4-person team in advance what exactly you are working on is cheating in this part of the lab!
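For illustration only (the design and the intentional term should be your own), one half of such a pair could use the crossed wiring of Braitenberg's "aggression" vehicle, where stronger light on one side drives the opposite motor:

```c
// NQC sketch of one half of a pair: a "pursuer" that turns toward the
// brighter side. Assumes light sensors on ports 1 (left) and 3 (right),
// left motor on OUT_A, right motor on OUT_C. A "shy" partner could swap
// the branches so it turns away from the light instead.
task main()
{
    SetSensor(SENSOR_1, SENSOR_LIGHT);   // left sensor
    SetSensor(SENSOR_3, SENSOR_LIGHT);   // right sensor

    while (true)
    {
        if (SENSOR_1 > SENSOR_3)
        {
            OnFwd(OUT_C); OnRev(OUT_A);  // light brighter on left: pivot left, toward it
        }
        else
        {
            OnFwd(OUT_A); OnRev(OUT_C);  // light brighter on right: pivot right
        }
    }
}
```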
Grading points: (best of 3 trials)
Both robots carry lights and move - 10
Both respond to light in some consistent way for 2 minutes - +10
2-vehicle behavior can reasonably be described in intentional terms - +10
Intentions guessed correctly by others - +10
Obviation of failure report - +10 (see Hand in section)
You may wish to redesign one or both of your vehicles for this part of the demo. The changes should be easy to perform in a couple of minutes during demo time in lab. You may build on the controller from Part I, but beware of preset thresholds for the light sensors. The LED lights are much weaker than the omnidirectional source from Part I, and you might not be able to detect them properly with the same controller.
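One possible way to avoid a hard-coded threshold that was tuned to the Part I source is to measure the ambient level at startup and set the threshold relative to it. A sketch of that idea in NQC (the port, outputs, and the offset of 3 are assumptions to tune):

```c
// NQC sketch: set the light threshold a little above the ambient level
// measured at startup, since the partner's LEDs are much dimmer than
// the Part I source. Assumes a light sensor on port 1, motors on A and C.
int threshold;

task main()
{
    SetSensor(SENSOR_1, SENSOR_LIGHT);
    Wait(50);                      // let the reading settle
    threshold = SENSOR_1 + 3;      // just above ambient; tune the offset

    while (true)
    {
        if (SENSOR_1 > threshold) OnFwd(OUT_A + OUT_C);  // LED seen: approach
        else OnRev(OUT_A + OUT_C);                       // otherwise back off
    }
}
```

Start the program with the partner vehicle's LEDs out of view so the startup reading really is ambient.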
The ambient (overhead) lights can be turned on or off during the demo for Part II at your request.