Representing Honey Bee Behavior for Recognition Using Human Trainable Models

  • Authors:
  • Adam Feldman; Tucker Balch

  • Affiliations:
  • College of Computing, Georgia Institute of Technology; College of Computing, Georgia Institute of Technology

  • Venue:
  • Adaptive Behavior - Animals, Animats, Software Agents, Robots, Adaptive Systems
  • Year:
  • 2004

Abstract

Identifying and recording subject movements is a critical but time-consuming step in animal behavior research. The task is especially onerous in studies involving social insects because of the number of animals that must be observed simultaneously. To address this, we present a system that can automatically analyze and label animal movements by building a behavioral model from examples provided by a human expert. In conjunction with identifying movements, the system also recognizes the behaviors composed of those movements. Thus, with only a small training set of hand-labeled data, the system automatically completes the entire behavioral modeling and labeling process. In our experiments, activity in an observation hive is recorded on video; the video is converted into location information for each animal by a vision-based tracker, and numerical features such as velocity and heading change are then extracted. These features are used in turn to label the sequence of movements for each observed animal according to the model. Our approach combines kernel regression classification with hidden Markov model (HMM) techniques. The system was evaluated on several hundred honey bee trajectories extracted from a 15-minute video of activity in an observation hive.
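
The abstract names the pipeline stages (trajectory features, kernel regression classification, HMM smoothing) but not their implementation. The sketch below is a minimal illustration of that style of pipeline under stated assumptions, not the authors' code: the function names (extract_features, kernel_label_scores, viterbi_smooth), the Gaussian kernel bandwidth, the feature scaling, and the "sticky" transition matrix are all hypothetical choices made for the example.

```python
import numpy as np

def extract_features(xy, dt=1.0):
    """Per-frame speed and heading change from a (T, 2) array of tracked positions."""
    xy = np.asarray(xy, dtype=float)
    d = np.diff(xy, axis=0)                          # frame-to-frame displacement
    speed = np.linalg.norm(d, axis=1) / dt           # movement speed
    heading = np.arctan2(d[:, 1], d[:, 0])           # direction of motion
    turn = np.abs(np.diff(np.unwrap(heading)))       # heading change magnitude
    return np.column_stack([speed[1:], turn])        # both features aligned per frame

def kernel_label_scores(feats, train_feats, train_labels, n_classes, bandwidth=1.0):
    """Kernel-regression-style classification: Gaussian-weighted vote of labeled examples."""
    scores = np.zeros((len(feats), n_classes))
    for i, f in enumerate(feats):
        w = np.exp(-np.sum((train_feats - f) ** 2, axis=1) / (2 * bandwidth ** 2))
        for c in range(n_classes):
            scores[i, c] = w[train_labels == c].sum()
    scores += 1e-9                                    # avoid all-zero rows
    return scores / scores.sum(axis=1, keepdims=True)

def viterbi_smooth(scores, transition, prior):
    """Viterbi decoding: treat per-frame scores as emission likelihoods and
    smooth the movement label sequence with an HMM transition model."""
    T, K = scores.shape
    logp = np.log(prior) + np.log(scores[0])
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = logp[:, None] + np.log(transition)     # K x K scores for (prev, current)
        back[t] = cand.argmax(axis=0)                 # best predecessor per current state
        logp = cand.max(axis=0) + np.log(scores[t])
    path = np.zeros(T, dtype=int)
    path[-1] = logp.argmax()
    for t in range(T - 1, 0, -1):                     # backtrace the most likely path
        path[t - 1] = back[t, path[t]]
    return path

# Toy usage with three hypothetical movement classes and made-up training examples.
rng = np.random.default_rng(0)
track = np.cumsum(rng.normal(size=(200, 2)), axis=0)           # stand-in for tracker output
feats = extract_features(track)
train_feats = np.array([[0.5, 0.1], [2.0, 0.2], [1.0, 1.5]])   # hand-labeled feature examples
train_labels = np.array([0, 1, 2])
scores = kernel_label_scores(feats, train_feats, train_labels, n_classes=3)
sticky = np.full((3, 3), 0.05) + np.eye(3) * 0.85               # favors staying in one movement
labels = viterbi_smooth(scores, sticky, prior=np.full(3, 1 / 3))
```

The design point the example is meant to convey is the division of labor described in the abstract: the kernel step gives a frame-by-frame score for each movement label from a small set of hand-labeled examples, and the HMM step enforces temporal consistency so that isolated misclassifications are smoothed into coherent movement segments.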