Unmanned aerial vehicle

Small robotic airplanes carrying sophisticated visual sensors can help scientists quickly and efficiently monitor the health of large ecosystems.

INL's unmanned aircraft survey the landscape with a focus on the environment

By Sandra Chung, INL Research Communications Fellow

A little airplane about the size and speed of a large bird buzzes several hundred feet above the scrubby sagebrush landscape near the Idaho National Laboratory test airstrip. There's no human pilot onboard, just a tiny computer autopilot, some sensors and cameras, and a few pounds of fuel.

Hyperspectral sensor
The hyperspectral sensor collects much more detailed visual information than a standard digital camera.
These Arcturus unmanned aerial vehicles, or UAVs, are much more sophisticated than your typical remote-controlled plane. INL robotics and remote sensing experts have added state-of-the-art imaging and wireless technology to the UAVs to create intelligent remote surveillance craft that can rapidly survey a wide area for damage and track down security threats. But these robot planes aren't just for security anymore.

The planes' latest mission: to carry Resonon's compact Pika sensors high above the sagebrush steppes of eastern Idaho. Scientists can use the super-sensitive imaging tools to rapidly "read" the health of the environment on a much larger scale than is practical with ground-based sensors. The sensor information could help fight and prevent wildfires, preserve important ecosystems, and make cropland healthier and more productive. And by adapting the sensors to work on small, sturdy, computer-controlled planes, INL researchers have made airborne hyperspectral sensor technology significantly cheaper and easier to deploy on demand.

"It's a perfect fit," says Ryan Hruska, an INL remote sensing expert who calibrates the Pika sensors and develops flight plans for imaging missions. For the UAV and remote sensing team at INL, adding a hyperspectral sensor to their repertoire was a logical next step. "We've done several UAV imaging projects before with different kinds of cameras and sensors. The hyperspectral sensor really increases what we can see."

Since the 1970s, scientists have used high-flying cameras and sensors to map the Earth's surface and study earth, water and air. Hyperspectral sensors like the Pikas are more sensitive to color than a digital camera and can see things that a normal camera or human eye can't. The sensors reveal clues about the condition of vegetation and wildlife, and the presence of nutrients and pollutants. Planes fitted with such sensors are being used to track the devastating North American mountain pine beetle infestation that has contributed to more frequent encounters between grizzly bears and humans. And for much of 2010, NASA's AVIRIS hyperspectral sensor has been busy mapping and analyzing the Gulf of Mexico oil spill.

But AVIRIS is often booked more than a year in advance. It's expensive to deploy, and its tight schedule means that bad weather or technical hiccups can delay a mission for months or even years. Arcturus UAVs, however, can fly for longer on less fuel than a bigger plane with a human crew on board. The bird-size UAVs are also much cheaper to operate. And unlike the tightly booked AVIRIS system, they're available on demand and can stay aloft for as long as 16 hours.

Better than a bird's-eye view

Arcturus T-16s more or less fly themselves. The onboard computer autopilot does most of the work of coordinating take-off, flying and landing; all the human pilot has to do is program in a flight path and monitor the plane in the air.

Researcher in the field
Hyperspectral information from the UAV can be used to study the health of plants and ecosystems on a wide scale.
As the T-16 flies a straight-line path, the hyperspectral sensor on board captures one row of pixels at a time, like a scanner scanning a photo. But the sensor delivers many times more information about each pixel than a photograph would. While a digital camera's color sensor detects only three colors – red, green and blue – the Pika sensor on the plane detects light at 80 different wavelengths spread throughout the visible and infrared parts of the spectrum.
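The difference in data volume is easy to picture in code. A minimal sketch of the line-scan idea follows; the row width and number of scanned rows are made-up values for illustration, not the Pika's actual specifications:

```python
import numpy as np

# Hypothetical frame geometry for illustration only; the real Pika
# sensor's pixel counts are not given in the article.
cross_track_pixels = 640        # pixels per scanned row (assumed)
rgb_bands = 3                   # a standard color camera
hyperspectral_bands = 80        # the Pika sensor described here

# A line-scan sensor builds its image one row at a time. After
# 1,000 rows, the result is a 3-D data cube: rows x pixels x bands.
rows_scanned = 1000
cube = np.zeros((rows_scanned, cross_track_pixels, hyperspectral_bands))
rgb_image = np.zeros((rows_scanned, cross_track_pixels, rgb_bands))

# The cube holds 80 values per pixel instead of 3.
print(cube.size // rgb_image.size)  # → 26
```

Each pixel in the cube is a full 80-point spectrum rather than a single color triple, which is what makes the material identification described below possible.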

Scientists turn the sensor data into a graph that shows the intensity of the light at each wavelength. The more wavelengths the sensor splits the spectrum into, the higher the spectral resolution, and the more finely resolved this graph will be. Different materials absorb and emit different amounts of light at each wavelength, yielding a characteristic spectral signature. Scientists know what the spectral signatures of healthy plants, soil and oil look like, and they can recognize features in the graph that indicate plant stress or oil pollution.
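Matching a measured spectrum to a library of known signatures can be done in several ways; one common approach (not necessarily the one used at INL) is to compare the angle between spectra treated as vectors. A toy sketch, with made-up five-band signatures standing in for real 80-band library curves:

```python
import numpy as np

def spectral_angle(spectrum, reference):
    """Angle in radians between two spectra; smaller means more similar.
    This "spectral angle" comparison is a standard remote-sensing
    technique, shown here only as an illustration."""
    cos = np.dot(spectrum, reference) / (
        np.linalg.norm(spectrum) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Toy 5-band "signatures"; real libraries hold lab-measured curves.
signatures = {
    "healthy plant":  np.array([0.05, 0.08, 0.06, 0.50, 0.55]),
    "stressed plant": np.array([0.07, 0.10, 0.12, 0.30, 0.35]),
    "bare soil":      np.array([0.20, 0.25, 0.30, 0.35, 0.40]),
}

measured = np.array([0.06, 0.09, 0.07, 0.48, 0.52])
best = min(signatures, key=lambda k: spectral_angle(measured, signatures[k]))
print(best)  # → healthy plant
```

The sharp jump between the third and fourth toy bands mimics the "red edge" that makes healthy vegetation easy to pick out in the near infrared.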

The UAV team took the T-16 out for several test flights in 2009 and 2010 to calibrate the hyperspectral sensor and to find a combination of speed and altitude that maximizes the quality of the sensor data. Flying higher means coarser spatial resolution, but it also reduces the amount of wiggle introduced into the image when the plane inevitably jostles about in the wind.
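The altitude trade-off follows from simple camera geometry: the ground footprint of one pixel, often called the ground sample distance, grows linearly with height. A back-of-the-envelope sketch; the pixel pitch and focal length below are invented values, not the Pika's actual optics:

```python
def ground_sample_distance(altitude_m, pixel_pitch_m=7.4e-6,
                           focal_length_m=0.012):
    """Ground footprint of one pixel, in metres, for a simple
    pinhole-camera model. The optics values are assumptions made
    for illustration only."""
    return altitude_m * pixel_pitch_m / focal_length_m

for altitude in (150, 300, 600):  # metres above ground
    gsd_cm = ground_sample_distance(altitude) * 100
    print(f"{altitude} m -> {gsd_cm:.1f} cm per pixel")
```

Doubling the altitude doubles the ground footprint of each pixel, which is why the team had to balance flying height against image stability.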

The state-of-the-art hyperspectral sensors mounted on satellites and in manned aircraft are larger and have up to 30 times the spectral resolution of the Pika sensors mounted on T-16s. But satellites and manned planes also produce coarser images. Everything within an area the size of a basketball court might be represented by a single pixel on a satellite image. The T-16-mounted sensor, on the other hand, can pick out a single basketball on the court. This relatively high spatial resolution could help scientists swiftly analyze entire vegetation systems at the level of individual plants – an unprecedented combination of detail and speed.

Where the deer and the antelope graze

Jessica Mitchell, an Idaho State University graduate student, wants to know if hyperspectral information from the T-16-mounted sensor can reveal the health of sagebrush on the INL site in eastern Idaho. The site is a refuge for many rare animals like sage grouse, pygmy rabbits and pronghorn antelope that depend on sagebrush for food and shelter. The animals go after the highest quality sagebrush, and knowing where the healthiest plants are can help scientists predict grazing patterns and direct future development to places where it will have a minimal impact on wildlife.

Arcturus UAV
Once they're in the air, Arcturus UAVs can follow a programmed flight path without help from a human pilot.
Analyzing sagebrush health currently entails studying pieces of individual shrubs, or hiking through the brush with a handheld sensor. Both methods are too time-consuming to be practical over the nearly 900 square miles of sagebrush steppe at the INL site. Airborne hyperspectral imaging equipment can cover the same ground much faster, and has already been used to assess the health of dense forests. But sagebrush steppe contains a lot of bare ground, whose reflected light swamps the vegetation signal before it reaches airborne sensors.

Mitchell and her ISU colleagues are developing image processing methods that will help them analyze the sensor data and compensate for the distorting effects of bare ground. They've also identified some wavelengths that might be useful in assessing the nutrient levels in the sagebrush leaves. Mitchell is comparing sagebrush data from ground-based methods to data from the T-16-mounted sensor and a commercial manned hyperspectral imaging flight.
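One generic way to account for bare ground in a mixed pixel is linear spectral unmixing: model the pixel's spectrum as a weighted mix of known "endmember" spectra and solve for the weights. This is a standard remote-sensing technique shown purely as an illustration, not necessarily the ISU team's method; the spectra are toy five-band values:

```python
import numpy as np

# Toy endmember spectra (5 bands each), for illustration only.
sage = np.array([0.05, 0.08, 0.06, 0.50, 0.55])  # sagebrush
soil = np.array([0.20, 0.25, 0.30, 0.35, 0.40])  # bare ground
endmembers = np.column_stack([sage, soil])

# Simulate a pixel that is 40% sagebrush and 60% bare ground.
pixel = 0.4 * sage + 0.6 * soil

# Least-squares solve recovers the mixing fractions, separating the
# vegetation signal from the bare-ground contribution.
fractions, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
print(fractions.round(2))  # ≈ [0.4, 0.6]
```

With the ground fraction estimated per pixel, the remaining vegetation component can be analyzed on its own.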

So far, she and her colleagues have shown that the T-16 hyperspectral sensor agrees pretty closely with the ground-based sensor – which Hruska says is the "gold standard" for hyperspectral imaging. In addition, combining the hyperspectral data with recent LiDAR scans will add information about the 3-D contours of the terrain. The combined data allow scientists to make finer distinctions between different types of vegetation and quantify the total amount of plant matter – a handy measure of how productive the land is.

Meanwhile, the Federal Aviation Administration is developing new regulations for operating unmanned aircraft. There are currently no radar traffic control systems for such craft, so the plane's human crew has to stay on the ground and keep the plane within sight at all times. INL robotics expert Matthew Anderson is working with the FAA to develop and integrate technology that will automatically sense obstacles and guide unmanned aircraft around them. Once that technology and the appropriate regulations are in place, they could significantly extend the practical range and safety of UAV flights.

But INL's UAV and sensor team won't sit idle while the FAA finalizes its new regulations. The team is currently rounding up collaborators and funding to help them test the UAV-mounted hyperspectral sensor on another, completely different patch of vegetation far from the INL sagebrush steppe. As Anderson says, "It's time to leave the nest and see if we can get this thing into some different skies besides our own."


Copyright © 2011 Idaho National Laboratory