Wednesday, April 01, 2009

Building a Robot, a Two-Year Project

When building a robot it is helpful to ask yourself what you want it to do. For the first few robots I built, that question was either an afterthought or the answer so obvious that the question was never asked consciously (it was most likely the latter). All I wanted out of my first robot was motion, which turned out to be mainly a mechanical problem. Mechanical engineering is not exactly my forte—for example, when I need a wheel, a film canister lid with a rubber band around it will suffice nicely. I utilized hot melt glue extensively in my first designs. Needless to say, these projects were purely indoor affairs. Once the mechanical platform was constructed (begrudgingly), it was then just a matter of writing code to enable motion, collect sensor input, and process sensor input to make decisions that affect motion.

The first robot I built was a simple beast indeed. The whole robot consisted of two gear-motors ripped out of a camcorder focusing mechanism, two cad cell light sensors (a cad cell is a sensor made of cadmium sulfide whose resistance is inversely proportional to the amount of light striking its surface), and a BASIC Stamp 2 microcontroller. Never mind what all of this was built on and what was holding it together; it looked like a glue ball with wires sticking out all over the place. My roommate was playing Herbie Hancock on his stereo during its development, hence the moniker. It is easy to anthropomorphize a robot once it has a set of simple behaviors. Herbie was a he; he was attracted to light and wanted to find the very brightest source of light and bask in it. His most basic behavior, photophilia, is also shared by moths, which senselessly bash themselves into light bulbs, resulting in their demise by blunt trauma combined with hyperthermia, but I digress. The point of the moth analogy is that my first robot was essentially a floor moth endowed with significantly fewer survival skills and no ability to reproduce or sustain itself.
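The cross-coupling of light sensors to motors behind this kind of light seeking can be sketched in a few lines. This is a hypothetical Python reconstruction for illustration, not Herbie's actual BASIC Stamp code; the function names and base speed are my own inventions.

```python
def seek_light_step(left_light, right_light, base_speed=100):
    """Braitenberg-style light seeking: each motor's speed follows the
    *opposite* sensor, so the robot turns toward the brighter side."""
    total = left_light + right_light
    left_motor = base_speed * right_light / total
    right_motor = base_speed * left_light / total
    return left_motor, right_motor

# More light on the right -> faster left motor -> robot veers right.
left_motor, right_motor = seek_light_step(512, 700)
```

Because the two motor speeds always sum to the base speed, the robot keeps moving while it turns, which is enough to home in on the brightest spot in a room.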

The Herbie platform underwent several revisions, which included the addition of a sonar sensor, a third cad cell, and a solar cell array. The sonar sensor gave him the ability to see obstacles and precipices, and his software was further developed to avoid these hazards, thus making progress on the survival-skills front. The third cad cell and the solar array allowed him to sense a threshold light intensity at which he could charge his batteries, and his software was revised to let him stop and recharge in bright light. This effectively solved the sustenance problem; robot reproduction, however, is a problem that I don't think I'll begin work on until late in my robotics career.
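The stop-and-recharge behavior amounts to a simple threshold test layered on top of the roaming behavior. A minimal sketch, with the threshold value and all names invented for illustration:

```python
CHARGE_THRESHOLD = 800  # assumed light level at which the solar array can charge

def choose_behavior(light_level, battery_low):
    """Arbitrate between the two behaviors: bask to recharge when the
    light is bright enough and the batteries need it, otherwise roam."""
    if battery_low and light_level >= CHARGE_THRESHOLD:
        return "recharge"    # stop the motors and soak up photons
    return "seek_light"      # default photophilic roaming

behavior = choose_behavior(light_level=900, battery_low=True)
```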

The discussion of Herbie provides background on my robot construction philosophy and an opening for a project that is several orders of magnitude more complex: the Robo-Magellan. I christened my Robo-Magellan robot Sputnik. I know, Sputnik was a Russian space satellite – a program that consisted of forty-one Russian space satellites, to be more correct – but I am using the name anyway, mainly because I like it. Sputnik belongs to a class of robots designed to compete in the Robo-Magellan competition. As you can probably infer from the name, Robo-Magellan is a competition in which robots navigate. The robots must navigate autonomously, weigh no more than fifty pounds, fit within a four-foot cube, and not damage their environment. The object of the competition is twofold: minimize the time required to complete the course and contact target way-points. The actual expression for computing the score is slightly complex, but it is readily apparent that the score improves as more way-points are contacted in less time. With this challenge in mind, the answer to the question “what do I want my robot to do?” is simple and fairly complex at the same time. The simple answer: I want my robot to navigate between a series of predefined way-points, specified by given GPS coordinates and marked by eighteen-inch orange plastic traffic cones, touching each cone that it navigates to while avoiding all other obstacles in its path.
On the other hand, the complicated and consequently truncated answer sounds something like this: I want my robot to calculate a vector consisting of direction and distance for each possible way-point pair, optimize a route to contact only the cones that will yield the optimum ratio of race time to acquired way-points, use odometry to measure distance and an electronic compass to measure direction, check for sensor drift with data from a GPS module, correct for high-frequency noise in the GPS data with an inertial navigation system, and use machine vision to identify the orange traffic cones.
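The first step, computing a direction-and-distance vector for each way-point pair, can be sketched using the standard haversine distance and initial-bearing formulas. This is a generic textbook illustration, not Sputnik's actual code; the function name and coordinates are mine.

```python
import math

EARTH_RADIUS_M = 6_371_000

def waypoint_vector(lat1, lon1, lat2, lon2):
    """Return (distance_m, bearing_deg) from GPS fix 1 to GPS fix 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    # Haversine great-circle distance
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    distance = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    # Initial bearing: 0 degrees = north, measured clockwise
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return distance, bearing

# One thousandth of a degree of latitude is roughly 111 meters due north.
dist, heading = waypoint_vector(47.0, -122.0, 47.001, -122.0)
```

At course scale (tens of meters) a flat-earth approximation would also do, but the haversine form costs nothing and never breaks down.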

My initial approach was to endow Sputnik with an awareness of its immediate surroundings. Starting with a hacked remote-control truck and three sonar range-finding sensors, Sputnik was able to bounce around an area in two dimensions like a fly bounces around a room in three. Notice that I did not have to employ my mechanical engineering skills in devising this robot's platform. I refer to the controller responsible for rudimentary obstacle detection and avoidance as the platform controller. The platform controller is a subsystem that understands how to drive the robot and detect obstacles. Next I added the navigation controller, which is attached to a GPS module, an electronic compass, and a rotary encoder (odometer). The navigation controller stores data from all three sensors for retrieval by the system controller. The system controller is a fully fledged computer: it runs Windows XP and has, among other things, two USB cameras attached, which it uses to look for cones. Although each subsystem has some autonomy and can function in isolation, the system controller is the ultimate decision maker. Its purpose is to request and process sensor data and to make complex decisions based on all of the information available to it.
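The division of labor between the three controllers can be sketched schematically. Every class and method name here is invented to illustrate the layering; Sputnik's real subsystems talk over serial links, not Python method calls.

```python
class PlatformController:
    """Knows how to drive the robot and detect obstacles (sonar)."""
    def obstacle_ahead(self):
        return False  # stub; would read the sonar range finders
    def drive(self, speed, turn):
        pass          # stub; would command the motors

class NavigationController:
    """Buffers GPS, compass, and odometer readings for retrieval."""
    def latest(self):
        # stub; would return the most recent stored sensor readings
        return {"gps": (47.0, -122.0), "heading_deg": 90.0, "odometer_m": 12.5}

class SystemController:
    """Top-level decision maker: pulls sensor data, then commands motion."""
    def __init__(self, platform, nav):
        self.platform, self.nav = platform, nav
    def step(self):
        """One control cycle: dodge if needed, otherwise navigate."""
        if self.platform.obstacle_ahead():
            self.platform.drive(speed=0, turn=45)  # let the platform dodge
            return None
        fix = self.nav.latest()
        self.platform.drive(speed=100, turn=0)     # would steer toward a way-point
        return fix

controller = SystemController(PlatformController(), NavigationController())
fix = controller.step()
```

The payoff of this split is that each layer can be tested in isolation: the platform controller keeps the robot from hitting things even if the layers above it crash.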

The concept of using a variety of sensors and weighting each sensor's data by its credibility under changing circumstances is called sensor fusion. At present the system controller software does a rudimentary job of analyzing sensor data to form a model of its overall situation. In reality, the code is buggy and only moderately functional, but when it is mature enough I will venture to say it is performing sensor fusion. I estimate Sputnik will be competitive for the 2009 Robothon in Seattle and believe that a two-year schedule is appropriate for anyone attempting to build a similar robot. Robotics is an amalgam of electronic, electrical, mechanical, and software engineering, with each discipline playing an equally important role. Although many robotics applications are complex, the successful ones usually embrace modularity and simplicity whenever possible. By applying Occam's Razor, together with its engineering corollary, Murphy's Law (whatever can fail will fail), you can identify unnecessary complexity (weakness) and cut it out of the design. I am constantly tempted to go wild with my manifold sensor data, and suspect that I will after I win this competition, building inordinately complex simulations, aggressive path-planning algorithms, and advanced multidimensional maps of discovered terrain. Meanwhile, in order to constrain my programming focus to what needs to be done to compete effectively, I continually ask myself, “What do I want my robot to do?”
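As a toy illustration of the weighting idea behind sensor fusion, a complementary filter can blend a smooth-but-drifting compass heading with a noisy-but-unbiased GPS track heading. The weight and data here are invented for the example, not Sputnik's actual tuning:

```python
def fuse_heading(compass_deg, gps_deg, gps_weight=0.1):
    """Nudge the compass estimate toward the GPS estimate.

    A small gps_weight trusts the compass for short-term changes while
    letting the GPS slowly correct any long-term compass drift.
    """
    # Use the shortest angular difference so 359 -> 1 wraps correctly.
    error = (gps_deg - compass_deg + 180) % 360 - 180
    return (compass_deg + gps_weight * error) % 360

# Compass has drifted to 10 degrees while the GPS track says 0 degrees;
# each fusion step pulls the estimate 10% of the way back.
fused = fuse_heading(10.0, 0.0)
```

Real fusion schemes (Kalman filters and the like) make the weight itself a function of each sensor's estimated error, but the principle is the same: believe each sensor in proportion to its credibility.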
