Our lab is developing video-based tools for quantitatively analyzing animal behavior. We use machine vision and learning methods for this purpose. Our analyses begin with video of the animals behaving. This video captures a large amount of information about the behavior of the animals, but is high-dimensional and cannot be interpreted quantitatively in its raw form. In our research, we are searching for scientifically meaningful, concise yet detailed, quantitative representations of the behavioral information contained in input videos.
These tools can be used to gain insight into nervous system function, evolution, and ethology. Using Drosophila as a model organism, we apply these tools to answer a wide variety of questions, such as:
- What is the behavioral effect of activating a small set of neurons?
- How does the behavior of different drosophilid species differ?
- How does a fly's social experience affect its behavior?
- What are the "rules" governing fly interactions?
Besides providing insight into basic questions in the life sciences, performing large-scale experiments to answer this diverse set of questions informs us about the limitations of current techniques and about which new tools will have the biggest, broadest impact. Each of these experiments provides a weak signal of the "language" of behavior for Drosophila, and of which methodologies are useful in illuminating this underlying structure.
We can think of video of animals behaving as a raw measure of their behavior. Clearly, much processing is necessary for this data to be scientifically interpretable. Tracking the animals - converting these videos into trajectories of their positions in each frame - goes a long way toward compressing the data into a more interpretable form. Simple statistics such as the average speed of an animal or the fraction of time spent in a certain part of the environment are illuminating, but there is more information in the data than this. What is a good representation derived from these trajectories to illustrate the behavioral effects of a given manipulation? What is a general representation that will efficiently and completely describe the behavioral effects of many scientific experiments? This question is closely related to another: what is the behavioral vocabulary of a given model organism?
Our research focuses on using machine vision and learning to answer these questions. We are developing machine vision software for automatically tracking animals and developing algorithms for learning behavior detectors from manually segmented trajectories. Using these, we are encoding ethologists' knowledge of observed behavioral phenotypes. We are also using machine learning to automatically discover new behavioral phenotypes and statistics. Our data mining approaches use other signals of the neural state of an animal as a semi-supervised labeling of behavior, favoring correlations with, e.g., neuronal expression patterns in thousands of transgenic lines, neural recordings in freely behaving animals, and stimuli (or other measures of the environment of the animals) presented to the animals.
For the algorithms we develop, we are also focused on producing software that can be used by biologists without an in-depth knowledge of computer vision techniques. These algorithms must also generalize to many experiments and be accurate and robust enough to form the basis of scientific results.
We have developed a new machine learning-based system, JAABA, to enable researchers to automatically compute highly descriptive, interpretable, quantitative statistics of the behavior of animals. Through our system, the user encodes their intuition about the structure of behavior in their experiment by labeling the behavior of the animal, e.g. walking, grooming, or chasing, in a small set of video frames. JAABA uses machine learning techniques to convert these manual labels into behavior detectors that can then be used to automatically classify the behaviors of the animals in a large data set with high throughput. JAABA combines a powerful, fast machine learning method, an intuitive interface, and visualizations of the classifier to create an interactive, usable, general-purpose tool for training behavior classifiers. Combined with automatic tracking algorithms such as Ctrax, we envision that it will enable biologists to transform a qualitative understanding about behavioral differences into a quantitative statistic, then systematically look for signals only detectable through automatic, high-throughput, quantitative analysis.
Through a series of groundtruthing experiments, we showed that our system can be used by scientists without expertise in machine learning to independently train accurate behavior detectors. JAABA is general purpose, and we showed that it can be used to easily create a wide variety of accurate individual and social behavior detectors for three important model organisms: flies, larvae, and mice. We also showed that it can be used to create behavior classifiers robust enough to successfully be applied to a large, phenotypically diverse data set: our neural activation screen data.
To create a new behavior classifier, the user begins by labeling the behavior of animals in a small number of frames in which they are certain of the correct behavior label. They then push the "Train" button to pass these labels to the machine learning algorithm, which, within a few seconds, creates an initial behavior classifier that predicts the behavior label in all frames. The user can then examine these results, find and label frames on which the classifier predicts incorrectly but for which they are confident of the correct label, retrain the classifier, and repeat.
JAABA is a practical implementation of active learning, a subfield of machine learning in which only the most informative training examples are labeled. Traditionally, these are the unlabeled examples on which the current classifier is most unsure. Because of the fuzzy nature of behavior, frames on which the classifier is unsure are often frames for which the behavior label is truly ambiguous. Thus, we instead employ an interactive approach in which the user, aided by visualization and navigation tools for sifting through sets of videos of hundreds of animals and millions of frames, finds and labels frames for which they are certain of the correct label and the current classifier predicts incorrectly. The JAABA interface also improves communication between the user and the learning algorithm, allowing users with little knowledge of machine learning to understand what the algorithm is capable of and to diagnose why a classifier is misclassifying a given frame.
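The label-train-correct-retrain loop described above can be sketched in a few lines. JAABA itself is a MATLAB GUI built around boosting; the following Python analogue is purely illustrative, using scikit-learn's GradientBoostingClassifier as a stand-in classifier and synthetic per-frame features in place of real trajectory data:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Hypothetical per-frame features (e.g. speed, distance to the nearest fly)
# and a synthetic stand-in for the "true" behavior label of each frame.
n_frames, n_features = 5000, 8
X = rng.normal(size=(n_frames, n_features))
true_label = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

labels = np.full(n_frames, -1)  # -1 means "unlabeled"

def user_labels_frames(frames):
    # Stand-in for the user confidently labeling a few frames in the GUI.
    labels[frames] = true_label[frames]

def train(clf):
    # The "Train" button: fit on labeled frames, then predict every frame.
    mask = labels >= 0
    clf.fit(X[mask], labels[mask])
    return clf.predict(X)

# Round 1: label a small initial set of frames of each behavior category.
user_labels_frames(np.concatenate([np.flatnonzero(true_label == 0)[:10],
                                   np.flatnonzero(true_label == 1)[:10]]))
clf = GradientBoostingClassifier(n_estimators=50)
pred = train(clf)

# Rounds 2+: find frames the current classifier gets wrong and for which
# the user is certain of the correct label, label a few, and retrain.
for _ in range(3):
    wrong = np.flatnonzero((pred != true_label) & (labels < 0))
    if wrong.size == 0:
        break
    user_labels_frames(wrong[:20])
    pred = train(clf)

accuracy = (pred == true_label).mean()
```

Note how few frames ever get labeled (at most 80 of 5000 here); the point of the interactive loop is that corrections concentrated on the classifier's mistakes are far more informative than randomly chosen labels.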
We have developed a high-throughput system for quantifying the locomotion and social behavior of flies with both breadth and depth. This system was developed as part of the Fly Olympiad project at Janelia. We screened the behavioral effects of TrpA neural activation at a rate of 75 GAL4 lines per week over a period of 1.5 years.
In our system, we record video of groups of flies freely behaving in an open-field walking arena, the Fly Bowl. The Fly Bowl is a chamber for observing the locomotion and social behaviors of multiple flies. Its main difference from more standard arenas is that it was designed with automated tracking and behavior analysis algorithms in mind. We have made significant modifications to the original design to increase throughput, consistency, and image quality. These modifications allow us to use Ctrax to track individual flies’ positions accurately in a completely automated way. Our system consists of 8 bowls that we record from simultaneously.
To reduce disk storage, we developed a MATLAB-based data capture system which compresses our videos by a factor of 80 during recording and is lossless for the tracking algorithm (in our screen, we capture 11 TB of raw video data per week). Our data capture system also captures and monitors metadata information about the environment and the preparation of the flies to ensure that results are repeatable and data are comparable over long periods of time across the different rigs.
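One standard way to achieve this kind of compression, sketched here purely for illustration (this is not the lab's actual capture format), is background subtraction: store the static background image once and, for each frame, keep only the pixels that differ from it by more than a threshold, so reconstruction is exact for all foreground pixels:

```python
import numpy as np

# Hypothetical 64x64 8-bit arena image: a bright, static floor.
background = np.full((64, 64), 200, dtype=np.uint8)

def compress(frame, background, thresh=10):
    # Keep only the indices and values of pixels that differ from the
    # background by more than thresh; everything else is reconstructed
    # from the stored background.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    idx = np.flatnonzero(diff > thresh)
    return idx, frame.ravel()[idx]

def decompress(idx, values, background):
    frame = background.copy().ravel()
    frame[idx] = values
    return frame.reshape(background.shape)

# Synthetic frame: a dark fly-sized blob on the static background.
frame = background.copy()
frame[30:34, 40:44] = 30

idx, values = compress(frame, background)
restored = decompress(idx, values, background)
```

With animals occupying a tiny fraction of each frame, storing only foreground pixels yields large compression factors while leaving the pixels a tracker cares about untouched.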
To provide oversight for collection of this large data set, we developed visualization tools for examining the stability of experimental conditions and behavior statistics over time, and ensuring that we understood and accounted for correlations between recorded metadata and behavior.
To analyze the data, we developed an automatic pipeline that uses the compute cluster at Janelia. Typically, data are analyzed within 24 hours of being collected, and the results are stored in a database. The first step in our analysis is to track the positions of individual flies using an updated version of Ctrax. From the trajectories, we compute 85 time series of “per-frame” behavior measurements, for instance the fly’s speed in each frame, or the distance from the fly to the closest other fly in each frame. Our first level of characterization of the flies’ behavior is statistics of these measurements (in these examples, the average speed and the average distance to the closest fly) or histograms of these values.
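As a concrete illustration of this first level of analysis, the sketch below computes two such per-frame measurements (speed and distance to the closest other fly) and their summary statistics from synthetic trajectories. The array shapes, frame rate, and random-walk positions are assumptions for illustration, not the pipeline's actual code:

```python
import numpy as np

fps = 30.0  # hypothetical frame rate
rng = np.random.default_rng(1)

# Hypothetical tracked trajectories: (n_flies, n_frames, 2) x/y positions.
n_flies, n_frames = 10, 1000
pos = np.cumsum(rng.normal(scale=0.5, size=(n_flies, n_frames, 2)), axis=1)

# Per-frame speed of each fly, one time series per fly.
speed = np.linalg.norm(np.diff(pos, axis=1), axis=2) * fps

# Per-frame distance from each fly to the closest other fly.
disp = pos[:, None, :, :] - pos[None, :, :, :]   # pairwise displacements
dist = np.linalg.norm(disp, axis=3)              # (fly, fly, frame)
i = np.arange(n_flies)
dist[i, i, :] = np.inf                           # exclude self-distances
closest = dist.min(axis=1)                       # (fly, frame)

# First-level characterization: averages and histograms of the measurements.
mean_speed = speed.mean()
mean_closest = closest.mean()
speed_hist, speed_bins = np.histogram(speed, bins=20)
```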
Next, we use behavior classifiers trained using JAABA to compute discrete behavior labels for each fly and frame, labels of whether the fly is or is not performing each of a suite of behaviors, e.g. walking, chasing, and wing grooming. We can then represent the behavior of the flies in terms of the fraction of time that they perform a given behavior. We can also use these discrete behavior categories to segment the trajectories into similar types of data that can be analyzed together. Then, we can look at statistics of our per-frame measurements within these segments, e.g. average speed while chasing, or distance to the closest fly at the beginning of a jump. This allows us to remove the effects of common behaviors such as walking and stopping when scrutinizing less common but highly stereotyped behaviors such as courtship and grooming.
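Given per-frame behavior labels such as those a trained classifier produces, segmenting trajectories into bouts and computing within-segment statistics might look like the following hypothetical sketch; the bout-finding helper and the tiny example data are made up for illustration:

```python
import numpy as np

def segments(is_behaving):
    # Return (start, end) frame intervals (end exclusive) in which a
    # boolean per-frame behavior label is True.
    padded = np.concatenate(([False], is_behaving, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    return list(zip(edges[::2], edges[1::2]))

# Hypothetical per-frame data for one fly.
speed = np.array([1.0, 1.2, 5.0, 5.5, 6.0, 1.1, 4.8, 5.2, 1.0])
is_chasing = np.array([0, 0, 1, 1, 1, 0, 1, 1, 0], dtype=bool)

bouts = segments(is_chasing)                    # chasing bouts
frac_chasing = is_chasing.mean()                # fraction of time chasing
speed_while_chasing = speed[is_chasing].mean()  # per-frame stat within bouts
```

Restricting a per-frame measurement to frames within a behavior category is what lets uncommon but stereotyped behaviors be analyzed without being swamped by the statistics of walking and stopping.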
As part of the Fly Olympiad team project at Janelia, we performed a high-throughput, large-scale activation screen of 2,200 transgenic lines of adult Drosophila from the Rubin GAL4 collection. In each of these lines, a different sparse subset of neurons expresses the temperature-sensitive TRPA1 neural effector via the GAL4-UAS system; when the flies are at an elevated temperature, this sparse subset of neurons is activated. Our goal in the Fly Bowl screen is to understand how exciting these neurons affects behavior. More specifically, for each line, we are producing a quantitative description of how the behavior of flies from this line differs from the behavior of flies from a genetic control. These GAL4 lines are widely used at Janelia and elsewhere, and the behavior annotation provided by our screen will be a useful initial behavioral characterization in many studies.
In addition, the Fly Light team project at Janelia has imaged the dissected nervous systems of flies from each of these GAL4 lines to determine the expression pattern. Using the Fly Bowl screen data and the Fly Light data, we are performing a meta-analysis to determine which brain regions and neural circuits are involved in what behaviors. This analysis has the potential to provide insight into the organization of the fly brain and the function of individual neurons and anatomical regions.
We have screened a total of 2,200 lines. For 70% of these lines, data was collected on at least 2 separate occasions (different crosses at different times of year). Each time a line was tested, we collected 4 videos of 10 male and 10 female flies for 1,000 seconds each. In addition, we collected videos of our control flies multiple times per day. We reliably screened at a rate of 75 GAL4 lines per week over a 1.5 year period. In total, we collected, tracked, and analyzed 225 days of video of a total of 380,000 flies. This data set has been curated using a set of automatic checks to look for errors during collection or processing. We thus have a large, interesting, high quality data set to mine.
We present a machine learning–based system for automatically computing interpretable, quantitative measures of animal behavior. Through our interactive system, users encode their intuition about behavior by annotating a small set of video frames. These manual labels are converted into classifiers that can automatically annotate behaviors in screen-scale data sets. Our general-purpose system can create a variety of accurate individual and social behavior classifiers for different organisms, including mice and adult and larval Drosophila.
An important role of visual systems is to detect nearby predators, prey, and potential mates, which may be distinguished in part by their motion. When an animal is at rest, an object moving in any direction may easily be detected by motion-sensitive visual circuits [2, 3]. During locomotion, however, this strategy is compromised because the observer must detect a moving object within the pattern of optic flow created by its own motion through the stationary background. However, objects whose movement creates back-to-front (regressive) motion may be unambiguously distinguished from stationary objects, because forward locomotion creates only front-to-back (progressive) optic flow. Thus, moving animals should exhibit an enhanced sensitivity to regressively moving objects. We explicitly tested this hypothesis by constructing a simple fly-sized robot that was programmed to interact with a real fly. Our measurements indicate that whereas walking female flies freeze in response to a regressively moving object, they ignore a progressively moving one. Regressive motion salience also explains observations of behaviors exhibited by pairs of walking flies. Because the assumptions underlying the regressive motion salience hypothesis are general, we suspect that the behavior we have observed in Drosophila may be widespread among eyed, motile organisms.
Learning animal social behavior from trajectory features
E. Eyjolfsdottir, X. P. Burgos-Artizzu, S. Branson, K. Branson, D. Anderson, and P. Perona
Workshop on Visual Observation and Analysis of Animal and Insect Behavior (2012)
Prior Publications (3)
Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in real time--with minimal latency--opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behaviour. Here, we describe a system capable of tracking the three-dimensional position and body orientation of animals such as flies and birds. The system operates with less than 40 ms latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the extended Kalman filter and the nearest neighbour standard filter data association algorithm. In one implementation, an 11-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behaviour of freely flying animals. If combined with other techniques, such as 'virtual reality'-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals.
We present a camera-based method for automatically quantifying the individual and social behaviors of fruit flies, Drosophila melanogaster, interacting in a planar arena. Our system includes machine-vision algorithms that accurately track many individuals without swapping identities and classification algorithms that detect behaviors. The data may be represented as an ethogram that plots the time course of behaviors exhibited by each fly or as a vector that concisely captures the statistical properties of all behaviors displayed in a given period. We found that behavioral differences between individuals were consistent over time and were sufficient to accurately predict gender and genotype. In addition, we found that the relative positions of flies during social interactions vary according to gender, genotype and social environment. We expect that our software, which permits high-throughput screening, will complement existing molecular methods available in Drosophila, facilitating new investigations into the genetic and cellular basis of behavior.
We are seeking outstanding Postdoctoral Researchers to develop new algorithms for video-based analysis of animal behavior. In particular, we are looking for computer scientists with expertise in machine vision and learning interested in developing both novel algorithms and robust systems usable in cutting-edge biology research.
The application of machine vision and learning to biology is a growing field enabling important discoveries in neuroscience, genetics, drug discovery, and development. Possible projects include:
- Articulated model-based tracking for high-resolution video of behaving animals.
- Semi-supervised mining of behavioral structure from large data sets using a "human-in-the-loop" learning paradigm.
- Behavior detection using a combination of spatiotemporal features and pose tracking.
The successful candidate for this position will have:
- The creativity to develop new algorithms and interactive learning paradigms for using machine learning and vision for scientific discovery.
- Practical knowledge of the current state-of-the-art in machine vision and learning.
- The commitment and dedication to develop robust, working systems.
- Interest in biology applications of computer science and the new discoveries they enable.
- A PhD in Computer Science, Engineering, or a related field.
- Strong programming expertise in MATLAB and C++.
In the Branson Lab, the researcher will work closely with other researchers at Janelia to progress toward the following goals:
- Developing video-based tools for quantitatively analyzing animal behavior that can be used to greatly advance our knowledge of neuroscience, genetics, and their connection to ethology.
- Developing new methods for non-computer scientists to use machine learning and vision tools for scientific discovery.
- Discovering the vocabulary of fruit-fly behavior, and relating it to the structure and function of the animal's nervous system and genetics.
- Understanding the mathematical structure of behavior, and developing tools and heuristics to extract this structure, which can eventually be applied to human behavior.
HHMI's Janelia Farm Research Campus is a pioneering research center near Washington, D.C., where scientists from many disciplines develop and use emerging and innovative technologies to pursue science's most challenging problems. Established in 2006, Janelia was modeled after institutes like Bell Labs, with small groups collaborating on high-risk, high-reward, innovative science.
Applicants should email a CV and cover letter summarizing their research experience and interests to Kristin Branson at firstname.lastname@example.org.