2. Sensor Analysis

The sensor analysis addresses the question: given the constraints of the SRR, what sensors are needed to perform its tasks? Again, there is the additional question: can I obtain those sensors and make effective use of them?

Sensor Requirements

A more basic question is: what is the purpose of the sensors? Without a more detailed analysis we do not know, but based on experience and some reflection on the overall problem we can outline a few fundamental requirements:

    1. Is there a sample nearby?

    2. Where is the rover located?

    3. Which direction is the rover facing, i.e. its bearing?

    4. Is the rover approaching a landmark, starting platform, obstacle, sample, orange fence, etc.?

    5. Is the location stable, i.e. is the rover going to tumble?

The list is surely incomplete, but it is enough to let us survey the available sensors, determine where they may be useful, and ask whether I can make use of them.

Conventional Sensors

There are a multitude of sensors available for use on robots today. Some of these are not allowed on the SRR rovers since they would not work on another planetary body. The challenge directly prohibits the use of GPS or a compass to determine location and bearing. It indirectly prohibits other sensors, such as sonar, since the atmosphere on Mars is at very low pressure and on the Moon non-existent.

Two sensors often used in robots, especially those that work outdoors, are accelerometers and gyroscopes. The accelerometer is a clear choice for detecting when the robot might tumble. (Mystic 2 in '13 rolled over because of a small slump on the side of a hill.) Unfortunately, it is more challenging than one would hope to use for determining speed or direction.
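
To give a sense of how the accelerometer would be used for tumble detection, here is a minimal sketch that estimates pitch and roll from a static reading. The axis convention and the tilt threshold are assumptions for illustration; the real limit would have to come from testing the chassis.

    import math

    # Tilt limit, in degrees, before the rover is considered at risk of
    # tumbling. This threshold is a placeholder, not a measured value.
    TILT_LIMIT_DEG = 30.0

    def tilt_from_accel(ax, ay, az):
        """Estimate pitch and roll (degrees) from a static accelerometer
        reading in g, assuming x points forward, y left, and z up."""
        pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
        roll = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
        return pitch, roll

    def tumble_risk(ax, ay, az):
        pitch, roll = tilt_from_accel(ax, ay, az)
        return abs(pitch) > TILT_LIMIT_DEG or abs(roll) > TILT_LIMIT_DEG

    # Example: a reading taken while parked across a roughly 35-degree slope
    print(tumble_risk(0.0, 0.57, 0.82))   # True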

Similarly, one would expect a gyroscope to be useful for determining changes in bearing, but so far my experience is not favorable. I will continue experimenting with them. Also, I have only used inexpensive devices; more expensive ones may prove usable.
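
For completeness, the obvious approach is to integrate the gyro's yaw rate into a heading. A rough sketch follows; read_yaw_rate stands in for whatever call returns the z-axis rate from the device. Any bias in that rate is integrated along with the signal, which is exactly the drift problem I have been seeing.

    import time

    def track_heading(read_yaw_rate, initial_heading=0.0, duration=10.0):
        """Integrate the gyro's yaw rate (degrees/second) into a heading
        estimate. read_yaw_rate is a placeholder for whatever call returns
        the z-axis rate from the device. Any constant bias in that rate is
        integrated too, which is why the estimate drifts."""
        heading = initial_heading
        last = time.time()
        end = last + duration
        while time.time() < end:
            time.sleep(0.01)
            now = time.time()
            heading = (heading + read_yaw_rate() * (now - last)) % 360.0
            last = now
        return heading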

Accelerometers and gyroscopes in combination are sometimes referred to as an Inertial Measurement Unit (IMU), although many IMUs also include a compass. The Phidgets device that I have contains all three sensors. For the competition, I would need to satisfy the judges that the compass is not being used.

Some of the entrants have used lidar (a portmanteau of "light" and "radar"), which uses a laser beam to measure the distance to objects. These tend to be expensive and, since they actually use infra-red, may be problematic in the bright sun. Entrants who used them report mixed results: some found them useful, while others had sunlight problems.

There is one hobby version from Miremadi that is a less expensive possibility. The designer indicates it works well in sunlight as long as the sun is not shining directly into the sensor. A lidar would be useful for obstacle detection.

A similar device is a Sharp IR sensor such as the GP2Y0A02. Again, since these use infra-red, they are problematic outdoors. Teams that used them found them difficult to use for collision detection. They may be useful for other proximity measurements.
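
If I did use one for proximity, the conversion from the sensor's analog output to a distance is non-linear. A power-law fit is a common approach; the coefficients below are rough placeholders and would need calibration against the actual sensor.

    def sharp_ir_distance_cm(voltage):
        """Convert the GP2Y0A02 analog output (volts) to an approximate
        distance in centimetres. The response is non-linear, so a power-law
        fit is used here; the coefficients are rough placeholders and would
        need calibrating against the real sensor, particularly outdoors
        where ambient infra-red interferes."""
        if voltage < 0.4:           # beyond the sensor's ~150 cm range
            return None
        return 60.0 * voltage ** -1.1

    print(sharp_ir_distance_cm(2.5))   # roughly 22 cm
    print(sharp_ir_distance_cm(0.6))   # roughly 105 cm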

Wheel encoders are another typical sensor. The Dagu Wild Thumper chassis I am using does not have them on its motors, although they are available by special order (meaning they cost a lot) from China. Encoders are problematic because wheels can slip, and often do outdoors, which makes the results inaccurate. I just have an aversion to them because of the potential problems, especially with a skid-steer system.
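
For reference, the dead-reckoning update from encoder ticks is straightforward; the problem is the data going into it. The chassis constants below are hypothetical.

    import math

    # Hypothetical chassis constants; the real values would come from the
    # Wild Thumper wheels and whatever encoders were fitted.
    WHEEL_CIRCUMFERENCE_M = 0.38
    TICKS_PER_REV = 360
    TRACK_WIDTH_M = 0.30            # distance between left and right wheels

    def odometry_step(x, y, heading_rad, left_ticks, right_ticks):
        """Standard differential-drive dead reckoning from encoder ticks.
        On a skid-steer chassis the wheels slip in every turn, which is
        exactly where this estimate goes wrong."""
        left = left_ticks / TICKS_PER_REV * WHEEL_CIRCUMFERENCE_M
        right = right_ticks / TICKS_PER_REV * WHEEL_CIRCUMFERENCE_M
        heading_rad += (right - left) / TRACK_WIDTH_M
        distance = (left + right) / 2.0
        x += distance * math.cos(heading_rad)
        y += distance * math.sin(heading_rad)
        return x, y, heading_rad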

Vision is clearly one sensor that must be used in the challenge. I do not have a great understanding of vision processing techniques. It may be hubris on my part to think I can address the problems that many researchers with greater experience are attempting to solve.

Oddball Sensors

The sensors above are mentioned in any good book on mobile robots. Here are some other possible sensors that might be constructed for the SRR.

Forrest M. Mims III developed an LED Sun Photometer that led me to think about using the sun to determine the bearing of the robot. Since the robot has a clock, it would be easy to determine the azimuth of the sun and from this calculate the bearing of the robot. I performed some experiments with this, but time prevented me from refining it into a useful device. One team in 2012 attempted to use a similar device but reported that nearby buildings and trees in the park limited its usefulness.
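
The arithmetic to recover the bearing is simple once the sun's azimuth is known from the clock and the rover's rough location (from an ephemeris table or a library such as pysolar). A sketch, with the photometer reading expressed as an angle clockwise from the rover's forward axis:

    def rover_bearing(sun_azimuth_deg, sun_relative_deg):
        """Recover the rover's bearing from the sun.
        sun_azimuth_deg:  the sun's azimuth in degrees east of true north,
                          computed from the clock and the rover's rough
                          location.
        sun_relative_deg: the direction of the sun measured by the
                          photometer, in degrees clockwise from the rover's
                          forward axis."""
        return (sun_azimuth_deg - sun_relative_deg) % 360.0

    # Example: sun at azimuth 135 degrees, seen 40 degrees to the right of
    # the rover's nose, so the rover is facing 95 degrees.
    print(rover_bearing(135.0, 40.0))   # 95.0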

An alternative approach is to use the camera to detect the sun by locating the brightest area in the sky.
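
With OpenCV this amounts to blurring the frame and taking the brightest point; the blur keeps a single hot pixel or a specular reflection from winning. A sketch (the kernel size is a guess):

    import cv2

    def sun_pixel(frame):
        """Return the location and value of the brightest region in a
        camera frame, a rough proxy for the sun's position. Blurring first
        keeps a single hot pixel or a specular reflection from winning."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (31, 31), 0)
        _, max_val, _, max_loc = cv2.minMaxLoc(gray)
        return max_loc, max_val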

Obviously, both these approaches are challenged by trees, buildings, and clouds. The use of UV LEDs, or photodiodes, might eliminate the clouds as a challenge.

I also considered using the radio signal generated by the sun. The sun outputs a tremendous radio-frequency (RF) signal. Much of this is blocked by the atmosphere but frequencies around 2.5 GHz, the same as WiFi, pass through. One difficulty I encountered was finding a receiver design. I also determined that a directional antenna sufficient to locate the sun's position would be larger than my rovers could manage.

Another idea, somewhat related to vision, is the use of an optical mouse with a lens that focuses on the ground. The mouse measures the movement of the ground under the rover, which provides a measurement of the rover's own movement.
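
On Linux the mouse can be read directly as raw 3-byte packets. A sketch follows; the counts-per-metre scale factor is hypothetical, since it depends on the lens and the mouse's counts-per-inch and would have to be calibrated on the rover.

    # The scale factor is hypothetical: it depends on the lens and the
    # mouse's counts-per-inch and would have to be calibrated on the rover.
    COUNTS_PER_METRE = 40000.0

    def read_motion(packets=100, device="/dev/input/mice"):
        """Accumulate raw mouse deltas into an estimate of ground motion,
        in metres, along the mouse's two axes."""
        total_dx = total_dy = 0
        with open(device, "rb") as mouse:
            for _ in range(packets):
                flags, dx, dy = mouse.read(3)
                # dx and dy are two's-complement signed bytes
                total_dx += dx - 256 if dx > 127 else dx
                total_dy += dy - 256 if dy > 127 else dy
        return total_dx / COUNTS_PER_METRE, total_dy / COUNTS_PER_METRE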

The rovers need to communicate with one another using WiFi. The home beacon has a WiFi router to enable this information sharing. The WiFi signal strength from the router falls off with the distance to the rover. It might also be possible to measure the bearing to the router. Combined, this information might indicate the location of the rover. Possible problems are reflections from buildings, interference from other WiFi signals, and insufficient correlation between signal strength and distance.
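
The usual way to turn signal strength into distance is the log-distance path-loss model. A sketch, with both the reference level and the exponent guessed rather than measured:

    def distance_from_rssi(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=2.5):
        """Estimate the distance to the home-beacon router from WiFi signal
        strength using the log-distance path-loss model:
            rssi = rssi_at_1m - 10 * n * log10(distance)
        Both the reference level and the exponent n here are guesses; in an
        open park they would have to be measured, and reflections or other
        WiFi traffic could easily swamp the estimate."""
        return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

    print(distance_from_rssi(-65.0))   # roughly 10 m with these assumptions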

Conclusion

The answers to the main questions are still largely open. It is clear that vision processing is critical to meeting the challenge. Almost all the requirements listed above involve vision processing. A couple could be addressed with a reliable IMU, but even they would benefit from vision processing.

Can I use these sensors? For most of them, the answer is that I can. As for vision processing, I will accept the challenge of learning it.