Some applications of a motion detecting camera in remote environments

Pablo Bryant, San Diego State University, Field Station Programs
Hans-Werner Braun, University of California San Diego, HPWREN

February 2003

Background

Monitoring of various events, such as animal movement through corridors between island habitats, can be supported by inexpensive camera and wireless networking technology available today. An example utilizes an IQeye3 camera, which combines a 1288x968 pixel imager with an Ethernet-accessible web server in a single enclosure. The device consumes less than 3 Watts at 12 V DC input. Combined with a low power radio, such as a Wi-LAN VIP110-24, it is possible to power the radio and camera combination from a single solar array panel, with a single battery for power storage, in areas with plenty of sunshine (such as San Diego County).

Images of the initial installation can be found at http://hpwren.ucsd.edu/Photos/20030111/.

Results from other camera alternatives can be found at http://stat.hpwren.ucsd.edu/Imagery/, specifically based on Ricoh 3+ megapixel i700 cameras, and more generally at http://stat.hpwren.ucsd.edu/cameras/. While the IQeye3 camera has less resolution than the Ricoh i700, it is a more completely implemented network camera with, for our purposes, richer functionality, including the ability to define parameterizable image motion detect areas, an exchangeable lens, an easy interface for automated image collection via the network, and a syslog capability.
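To illustrate the kind of automated image collection the camera's network interface makes possible, the following Python sketch polls a camera over HTTP and stores timestamped frames. It is a minimal sketch only: the host name and the /snapshot.jpg path are hypothetical placeholders, not the IQeye3's actual URL scheme, which would come from the camera's documentation.

    # Periodically fetch a frame from a network camera and save it with a timestamp.
    # CAMERA_URL is a hypothetical placeholder endpoint.
    import time
    import urllib.request
    from datetime import datetime, timezone

    CAMERA_URL = "http://camera.example.org/snapshot.jpg"
    INTERVAL_SECONDS = 60  # poll once per minute

    def fetch_image(url):
        """Retrieve a single JPEG frame from the camera's web server."""
        with urllib.request.urlopen(url, timeout=30) as response:
            return response.read()

    def main():
        while True:
            stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
            try:
                with open(f"image-{stamp}.jpg", "wb") as f:
                    f.write(fetch_image(CAMERA_URL))
            except OSError as err:
                # Wireless links drop occasionally; note the failure and keep polling.
                print(f"{stamp}: fetch failed: {err}")
            time.sleep(INTERVAL_SECONDS)

    if __name__ == "__main__":
        main()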

For the initial setup we utilized an existing 2.4GHz radio node, adding the camera in a water-tight enclosure, powered via its power-over-Ethernet capability.

Initial camera results

For the initial tests a single motion detect area was defined at the bottom of a stairway, to detect movement in that part of the image.

The motion detect area was parameterized so that a significant fraction of its pixels had to change to trigger an event, to prevent vegetation moving in the wind from causing images to be taken. In addition, the camera was set up to not only collect the image at the time of the event, but also to add one pre-trigger and one post-trigger frame. This provides insight into what caused an event and how it evolved, for example showing from where to where the movement is happening. The example IQeye3 images used in this summary are arranged as triplets, with the trigger event displayed in the center image.
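The fractional threshold can be illustrated with a small frame-differencing sketch in Python: an event fires only when a significant fraction of the pixels inside a defined region change between consecutive frames, so scattered changes from wind-blown vegetation stay below the threshold. This is a stand-in for the idea, not the IQeye3's internal algorithm; the region coordinates, per-pixel delta, and 20% fraction are arbitrary example values.

    # Simple frame-differencing stand-in for a parameterized motion detect area.
    # Not the camera's actual algorithm; region and thresholds are example values.
    import numpy as np
    from PIL import Image

    def motion_fraction(prev_path, curr_path, box, pixel_delta=25):
        """Fraction of pixels in box (left, top, right, bottom) that changed."""
        prev = np.asarray(Image.open(prev_path).convert("L").crop(box), dtype=np.int16)
        curr = np.asarray(Image.open(curr_path).convert("L").crop(box), dtype=np.int16)
        return (np.abs(curr - prev) > pixel_delta).mean()

    REGION = (400, 700, 700, 960)  # hypothetical motion detect area in pixels
    if motion_fraction("frame_prev.jpg", "frame_curr.jpg", REGION) > 0.20:
        print("motion event: keep pre-trigger, trigger, and post-trigger frames")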

The following three triplets are from the initial test, with the motion detect area nearly 100 feet away from the camera.

In the above, the first series shows a person passing through the motion detect area, the second series resulted from a goose running on the ground, and the third was triggered by a bird in flight.

A subsequent consideration was to track events at night, based on a passive infrared (PIR) detector. A $20 PIR sensor and floodlight unit was added at the bottom of the stairs and set to a very short light duration.

As can be seen, the appearance of an infrared-radiating body set off the PIR sensor, which turned on the floodlight, which in turn caused enough pixel changes in the motion detect area to trigger the camera. This worked despite the camera and the PIR/floodlight combination being completely independent systems.

To better track animals close to the PIR sensor, the camera was moved right above the sensor, and pointed directly at the area that the floodlight illuminates.

This allowed the tracking of animals as small as a small rabbit, as seen in the next image triplet.

The first image barely shows the rabbit, illuminated only by moonlight, as it jumps into the image. The second image shows the animal fully illuminated, while the third shows motion blur in the front part of its body as it gets ready to jump out of the image area.

Obviously, it is even easier to track larger animals, like a dog:

Second application, a sensor pod at Superbowl 2003 in San Diego

As part of ongoing investigations into new ways to conduct research at field stations, we have been prototyping a more integrated and portable wireless sensor platform that includes a flexible power system and is built around a low power Wi-LAN VIP110-24 2.4GHz Ethernet radio.

This pod had to be small enough to be stowed in the back of a car and set up quickly by one person. In addition, it had to be lightweight enough for one or two researchers to carry into the back country and deploy for the collection and real-time transmission of data via a high speed wireless link. Distances of over 20 miles line of sight have been attained with a throughput of over 7 megabits per second with the radios used.

HPWREN has been testing low power radios that can be used at solar powered radio repeater sites within the HPWREN network. The Wi-LAN VIP110-24 radio draws only 5 Watts, or around 400 mA at 12 V DC, making long-term deployment with a small-sized solar power plant feasible.

This configuration was run for an extended period of time, 24 hours per day, with no downtime related to the setup, even after an IQeye3 camera was added. The single array/battery configuration, using a 60 Watt solar array and a 100 Ah battery, was able to support the IQeye3 camera in addition to the radio.
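A back-of-the-envelope power budget, worked in Python from the figures quoted here (roughly 5 Watts for the radio, up to 3 Watts for the camera, a 60 Watt array, and a 100 Ah battery at 12 V), suggests why the configuration ran without interruption. The five peak sun-hours per day is an assumed average for a sunny San Diego County site, not a measured value.

    # Nominal power budget for the radio-plus-camera configuration.
    # SUN_HOURS is an assumed average of peak sun-hours per day.
    RADIO_W = 5.0       # Wi-LAN VIP110-24 draw
    CAMERA_W = 3.0      # IQeye3 draw (upper bound)
    PANEL_W = 60.0      # solar array rating
    BATTERY_AH = 100.0  # battery capacity
    VOLTS = 12.0
    SUN_HOURS = 5.0

    load_wh_per_day = (RADIO_W + CAMERA_W) * 24   # about 192 Wh/day
    yield_wh_per_day = PANEL_W * SUN_HOURS        # about 300 Wh/day before losses
    battery_hours = BATTERY_AH * VOLTS / (RADIO_W + CAMERA_W)  # about 150 h

    print(f"daily load:   {load_wh_per_day:.0f} Wh")
    print(f"daily yield:  {yield_wh_per_day:.0f} Wh (nominal)")
    print(f"autonomy:     {battery_hours:.0f} hours without sun (nominal)")

Even before derating for charge controller losses and battery depth of discharge, the array's nominal daily yield exceeds the roughly 190 Wh daily load, and the battery alone could nominally carry the load for several days without sun.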

For a more integrated solution, the pod was built out of 8" diameter PVC and is about 40" long with convex end caps. The tripod base is made from 3' sections of 1/2" steel pipe which thread into three steel flanges affixed to the pod's base. The bottom third of the pod comprises the battery bay, which contains two 25 Ah 12 V DC sealed batteries. The middle section houses a Campbell Scientific CR510 datalogger alongside a 4 Ampere battery charge controller. Modular plugs were added to the input side of the controller for the solar panels, so that one or more panels can easily be added or removed as the application dictates. The upper section of the pod houses the camera and the radio. Bulkhead N-connectors on the outside of the pod, wired to the radio, allow for easy connection of antenna coaxial cable. The top holds a detachable steel mast that serves as a mount for a 20 Watt solar panel, a planar array antenna, and weather instrumentation.

The left of the two images above shows the sensor pod located on the roof near the Superbowl event, with built-in environmental sensors and the camera pointed at the Superbowl stadium. The right image includes the wireless relay atop the stadium rim, which supported the wireless connection between the sensor pod and the network base station at San Diego State University during the Superbowl event.

Four motion detect areas were defined in the camera for the Superbowl event, one in the center, two on the stadium rim, and one in the air-space above the stadium.

The air-space area was added after the Superbowl event started, when it became apparent that fireworks made good motion detect targets, giving us an excellent opportunity to experiment in such an environment.

The image series above predates the Superbowl event itself, and shows a trolley passing through the motion detect area in approximately the center of the image.

These four series of images show the results of the fireworks. Besides the aerial explosions of the trigger events themselves, it is possible to see what precedes the explosions and how they evolve.

Conclusion

This summary shows only some initial applications of a motion-detect camera connected via a wireless high-performance data networking environment, enabled by the Internet interoperability of the chosen camera. While there are many more application opportunities, a next step is to deploy the sensor pod with the camera more operationally in SDSU's Santa Margarita Ecological Reserve and use it to track animal movement.