
Vision System Automates Analysis of Bee Activity for Insight into Biologically Inspired Robot Design

The animal movement analysis system is part of the BioTracking Project, an effort conducted by Georgia Institute of Technology robotics researchers led by Tucker Balch, an assistant professor of computing.

"We believe the language of behavior is common between robots and animals," Balch said. "That means, potentially, that we could videotape ants for a long period of time, learn their 'program' and run it on a robot."

Social insects such as ants and bees demonstrate that successful large-scale, robust behavior can emerge from the interactions of many simple individuals, Balch explained. Such behavior can offer ideas on how to organize a cooperating colony of robots capable of complex operations.

To expedite the understanding of such behavior, Balch's team developed a computer vision system that automates the analysis of animal movement - once an arduous and time-consuming task. Researchers are using the system to analyze the sequential movements that encode information - in bees, for example, the location of distant food sources, Balch said. He will present the research at the Second International Workshop on the Mathematics and Algorithms of Social Insects on Dec. 16-17 at Georgia Tech.

With an 81.5 percent accuracy rate, the system can automatically analyze bee movements and label them based on examples provided by human experts. This level of labeling accuracy is high enough to allow researchers to build a subsequent system to accurately determine the behavior of a bee from its sequence of motions, Balch explained.
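The article does not detail the labeling method, but a minimal sketch of example-based motion labeling might look like the following, assuming each trajectory segment is reduced to a small feature vector (the speed and turn-rate features here are illustrative choices, not the team's actual ones) and matched to its nearest hand-labeled example:

    import math

    def segment_features(points):
        """Reduce a list of (x, y) positions to (mean speed, mean turn rate)."""
        speeds, turns = [], []
        for i in range(1, len(points)):
            dx = points[i][0] - points[i - 1][0]
            dy = points[i][1] - points[i - 1][1]
            speeds.append(math.hypot(dx, dy))
            if i >= 2:
                prev = math.atan2(points[i - 1][1] - points[i - 2][1],
                                  points[i - 1][0] - points[i - 2][0])
                curr = math.atan2(dy, dx)
                # Wrap the heading change into (-pi, pi].
                turns.append(math.atan2(math.sin(curr - prev),
                                        math.cos(curr - prev)))
        return (sum(speeds) / max(len(speeds), 1),
                sum(turns) / max(len(turns), 1))

    def label_segment(points, examples):
        """examples: list of ((speed, turn_rate), label) pairs, hand-labeled."""
        f = segment_features(points)
        nearest = min(examples, key=lambda ex: (ex[0][0] - f[0]) ** 2 +
                                               (ex[0][1] - f[1]) ** 2)
        return nearest[1]  # the label of the closest labeled example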

For example, one sequence of motions bees commonly perform is the waggle dance, which consists of arcing to the right, waggling (walking in a generally straight line while oscillating left and right), arcing to the left, waggling again and so on. These motions encode the locations of distant food sources, according to Cornell University Professor of Biology Thomas Seeley, who has collaborated with Balch on this project. Balch is also working with Professor Deborah Gordon of Stanford University on related work with ants.
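For a sense of what that encoding carries, here is a hedged sketch of the classic decoding first worked out by Karl von Frisch: the waggle run's angle from vertical maps to the food's bearing relative to the sun, and its duration scales with distance. The calibration constant below is purely illustrative, as real calibrations vary by colony and study:

    def decode_waggle(angle_from_vertical_deg, waggle_duration_s,
                      sun_azimuth_deg, meters_per_waggle_second=750.0):
        # Direction: the dance angle, rotated by the sun's current azimuth.
        bearing_deg = (sun_azimuth_deg + angle_from_vertical_deg) % 360.0
        # Distance: roughly linear in waggle-run duration; the default rate
        # is a placeholder, not a measured constant.
        distance_m = meters_per_waggle_second * waggle_duration_s
        return bearing_deg, distance_m

    # A dance 40 degrees right of vertical, waggling 0.8 s, sun at 120 degrees:
    print(decode_waggle(40.0, 0.8, 120.0))  # -> (160.0, 600.0)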

Balch's animal movement analysis system has several components. First, researchers shoot 15 minutes of videotape of bees - some of which are marked with brightly colored paint and returned to an observation hive. Then computer vision-based tracking software converts the video of the marked bees into x- and y-coordinate location information for each animal in each frame of the footage. Some segments of this data are hand-labeled by a researcher and then used as motion examples for the automated analysis system.
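In code, the hand-off between the tracker and the labeling step might look like this (the data layout and names are hypothetical; the point is that the tracker yields one x, y position per marked bee per frame, and a researcher labels a few time ranges to seed the classifier):

    from dataclasses import dataclass, field

    @dataclass
    class Track:
        bee_id: str
        positions: list = field(default_factory=list)  # (frame, x, y) tuples

    def hand_label(track, start_frame, end_frame, label):
        """Cut out a hand-labeled segment of a bee's trajectory."""
        segment = [(x, y) for f, x, y in track.positions
                   if start_frame <= f <= end_frame]
        return segment, label

    # e.g., a researcher marks frames 120-180 of bee "B3" as a waggle run:
    # examples.append(hand_label(tracks["B3"], 120, 180, "waggle"))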

In future work, Balch and his colleagues will build a system that can learn executable models of these behaviors and then run the models in simulation. These simulations, Balch explained, would reveal the accuracy of the models. Researchers don't yet know if these models will yield better computer programming algorithms, though they are hopeful based on what previous research has revealed.
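One plausible (and entirely hypothetical) shape for such an executable model is a Markov chain over the motion labels: learn transition probabilities from the labeled sequences, then sample from them in simulation and compare the synthetic dances with real ones:

    import random

    # Transition probabilities between motion labels (numbers are
    # illustrative, not learned from real data).
    transitions = {
        "arc_right": {"waggle": 0.9, "arc_right": 0.1},
        "waggle":    {"arc_left": 0.45, "arc_right": 0.45, "waggle": 0.1},
        "arc_left":  {"waggle": 0.9, "arc_left": 0.1},
    }

    def simulate(start="waggle", steps=10, seed=0):
        rng = random.Random(seed)
        state, sequence = start, [start]
        for _ in range(steps):
            nxt = rng.choices(list(transitions[state]),
                              weights=list(transitions[state].values()))[0]
            sequence.append(nxt)
            state = nxt
        return sequence

    print(simulate())  # e.g. ['waggle', 'arc_left', 'waggle', 'arc_right', ...]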
