
Dynamic Autonomy

Robot Implementation

The robot used in this work is a modified ATRVJr robot platform, commercially available from iRobot. The robot has a Sony CCD camera that can pan, tilt, and zoom to provide visual feedback to the user. The robot also uses this camera in the autonomous modes to characterize the environment and even conduct object tracking. The INL has successfully interfaced a forward-looking infrared (FLIR) camera to an ATRVJr robot and has developed software that allows the data from this camera to be integrated into the robot control architecture. Fused data from the FLIR and CCD cameras will permit both autonomous and human-assisted recognition of relevant heat sources, including fires and human heat signatures.

Photo: showing FLIR

The INL has developed behaviors that allow the ATRVJr to search for, track, and intercept visual targets even at high speeds. These capabilities will be leveraged to meet perceptual challenges such as identifying an intruder and locating a fire. The robot is also equipped with an aluminum "hat" and a variety of serial and digital I/O ports, which allow the robot to carry mission-specific sensors such as a Gamma Locating Device (GLD).

For this system to meet its goals, we must be able to guarantee that the robot will protect itself and its environment. To do so, we fuse information from a variety of range sensors. A laser range finder is mounted on the front, and 17 sonar sensors are located around the mid-section of the robot. The robot also has highly sensitive bump strips on the front and rear that register whether anything has been touched. To protect the top of the robot, especially the cameras and mission-specific sensors placed there, we have added infrared proximity sensors that indicate when an object is less than nine inches from the robot.
Additional infrared proximity sensors have been placed on the bottom of the robot and point ahead of the robot toward the ground in order to prevent the robot from traveling into open space (e.g., driving off a landing and down a stairway). Together, these sensors provide a nearly impervious field of protection around the robot and allow the operator to command the robot with full confidence that the robot will not damage itself or its environment.
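The guarded-motion logic described above can be sketched as a single check over the protective sensors. This is a minimal illustration only; the sensor names, units, and the way the real control architecture combines these signals are assumptions, not the INL implementation.

```python
# Hypothetical guarded-motion check; all names, units, and thresholds are
# illustrative assumptions, not the INL control architecture's actual API.

PROXIMITY_LIMIT_IN = 9.0  # IR proximity sensors trigger under nine inches


def motion_permitted(laser_min_in, sonar_min_in, bump_triggered,
                     ir_proximity_in, cliff_ir_sees_ground):
    """Return True only if every protective sensor reports a safe state.

    laser_min_in / sonar_min_in: minimum range (inches) from each group
    bump_triggered: True if any front/rear bump strip registered contact
    ir_proximity_in: readings (inches) from the top-mounted IR sensors
    cliff_ir_sees_ground: True if the downward IR still detects the floor
    """
    if bump_triggered:
        return False            # something has been touched: stop
    if not cliff_ir_sees_ground:
        return False            # open space ahead (e.g., a stairway)
    if min(laser_min_in, sonar_min_in) < PROXIMITY_LIMIT_IN:
        return False            # obstacle inside the protective field
    if any(r < PROXIMITY_LIMIT_IN for r in ir_proximity_in):
        return False            # object too close to the cameras on top
    return True
```

In this sketch any single sensor can veto motion, which matches the "field of protection" idea: the robot moves only when every sensor agrees the path is safe.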

Photo: INL robot

However, avoiding obstacles is not sufficient. Many adverse environments include uneven terrain such as rubble, and the robot must be able to recognize and respond to these obstacles. The robot has inertial sensors that provide acceleration data in three dimensions. These data are fused with wheel-encoder information on the actual velocity and acceleration of the wheels, and with the current draw from the batteries, to produce a measure of the "unexpected" resistance encountered by the robot. The user can choose to set a resistance limit that will automatically stop the robot once the specified threshold has been exceeded. The resistance limit is invaluable not only for rough terrain, but also in situations when the user needs to override the "safe motion" capabilities (based on the obstacle avoidance sensors) to do things like push chairs and boxes out of the way or push doors open. In addition, the robot has tilt sensors that indicate pitch and roll.
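The resistance measure above fuses three signals: disagreement between inertial and encoder-derived acceleration, and battery current beyond what the motion should require. A minimal sketch, assuming simple weighted fusion (the weights, units, and function names are illustrative, not the INL algorithm):

```python
# Hypothetical "unexpected resistance" fusion; weights, units, and names
# are illustrative assumptions, not the INL algorithm.

def unexpected_resistance(inertial_accel, encoder_accel,
                          current_draw_a, expected_current_a,
                          w_motion=1.0, w_current=1.0):
    """Fuse inertial, wheel-encoder, and battery-current data.

    inertial_accel: forward acceleration from the inertial sensors (m/s^2)
    encoder_accel: acceleration implied by the wheel encoders (m/s^2)
    current_draw_a / expected_current_a: measured vs. modeled draw (A)
    """
    # Wheels accelerating without the body following (or vice versa)
    # suggests slippage or an obstruction.
    motion_mismatch = abs(encoder_accel - inertial_accel)
    # Drawing more current than the motion model predicts suggests the
    # drivetrain is pushing against something.
    excess_current = max(0.0, current_draw_a - expected_current_a)
    return w_motion * motion_mismatch + w_current * excess_current


def should_stop(resistance, limit):
    """Honor the user-selected resistance limit, if one has been set."""
    return limit is not None and resistance > limit
```

Leaving the limit unset (None) corresponds to the override case in the text: the user accepts sustained resistance in order to push doors or boxes out of the way.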
