
45 Janelia Publications

Showing 21-30 of 45 results
    06/26/18 | Honeybee detection and pose estimation using convolutional neural networks.
    Rodriguez IF, Branson KM, Acuna E, Agosto-Rivera J, Giray T, Megret R
    RFIAP 2018. 2018 Jun 26:

    The ability to automate the analysis of video for monitoring animals and insects is of great interest for behavior science and ecology [1]. In particular, honeybees play a crucial role in agriculture as natural pollinators. However, recent studies have shown that phenomena such as colony collapse disorder are causing the loss of many colonies [2]. Due to the large number of interacting factors that could explain these events, a multi-faceted analysis of the bees in their environment is required. In our work, we focus on developing tools to help model and understand their behavior as individuals, in relation to the health and performance of the colony.

    In this paper, we report the development of a new system for the detection, localization and tracking of honeybee body parts from video on the entrance ramp of the colony. The proposed system builds on recent advances in Convolutional Neural Networks (CNN) for human pose estimation and evaluates their suitability for the detection of honeybee pose as shown in Figure 1. This opens the door for novel animal behavior analysis systems that take advantage of the precise detection and tracking of the insect pose.
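Pose-estimation CNNs of the kind referenced here typically output one confidence map (heatmap) per body part, and the keypoint coordinate is read off as the location of the map's peak. A minimal NumPy sketch of that final readout step (the map sizes and part names are made up for illustration; the paper's actual network and post-processing may differ):

```python
import numpy as np

def keypoints_from_heatmaps(heatmaps):
    """Extract the (row, col) coordinate of each body part from its
    per-part confidence map by taking the location of the peak."""
    coords = []
    for hm in heatmaps:
        r, c = np.unravel_index(np.argmax(hm), hm.shape)
        coords.append((int(r), int(c)))
    return coords

# Toy example: two 5x5 confidence maps with peaks at known locations.
h = np.zeros((2, 5, 5))
h[0, 1, 3] = 1.0   # hypothetical "head" part
h[1, 4, 0] = 1.0   # hypothetical "abdomen tip" part
print(keypoints_from_heatmaps(h))  # [(1, 3), (4, 0)]
```

Real systems usually refine the argmax with sub-pixel interpolation, but the peak-picking step above is the core of heatmap-based keypoint detection.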

    Branson Lab, Freeman Lab
    10/22/15 | Imaging the neural basis of locomotion.
    Branson K, Freeman J
    Cell. 2015 Oct 22;163(3):541-2. doi: 10.1016/j.cell.2015.10.014

    To investigate the fundamental question of how nervous systems encode, organize, and sequence behaviors, Kato et al. imaged neural activity with cellular resolution across the brain of the worm Caenorhabditis elegans. Locomotion behavior seems to be continuously represented by cyclical patterns of distributed neural activity that are present even in immobilized animals.

    10/24/19 | Importance Weighted Adversarial Variational Autoencoders for Spike Inference from Calcium Imaging Data
    Daniel Jiwoong Im, Sridhama Prakhya, Jinyao Yan, Srinivas C. Turaga, Kristin Branson
    CoRR. 10/2019;abs/1906.03214:

    The Importance Weighted Auto Encoder (IWAE) objective has been shown to improve the training of generative models over the standard Variational Auto Encoder (VAE) objective. Here, we derive importance weighted extensions to Adversarial Variational Bayes (AVB) and Adversarial Autoencoder (AAE). These latent variable models use implicitly defined inference networks whose approximate posterior density qφ(z|x) cannot be directly evaluated, an essential ingredient for importance weighting. We show improved training and inference in latent variable models with our adversarially trained importance weighting method, and derive new theoretical connections between adversarial generative model training criteria and marginal likelihood based methods. We apply these methods to the important problem of inferring spiking neural activity from calcium imaging data, a challenging posterior inference problem in neuroscience, and show that posterior samples from the adversarial methods outperform factorized posteriors used in VAEs.
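The importance-weighted bound underlying this line of work replaces the single-sample ELBO with a log-average of k importance weights w_i = p(x, z_i) / q(z_i | x). A toy NumPy sketch of that estimator on a conjugate Gaussian model (this illustrates the generic IWAE bound only, not the paper's adversarial variant, whose whole point is handling implicit q whose density cannot be evaluated; the model and proposal here are invented for illustration):

```python
import numpy as np

def log_gauss(x, mu, var):
    """Log density of N(x; mu, var)."""
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def iw_bound(x, k, rng):
    """k-sample importance-weighted bound on log p(x) for a toy model:
    p(z) = N(0, 1), p(x|z) = N(z, 1), proposal q(z|x) = N(x/2, 1/2)."""
    z = rng.normal(x / 2, np.sqrt(0.5), size=k)      # z_i ~ q(z|x)
    log_w = (log_gauss(z, 0.0, 1.0)                  # log p(z)
             + log_gauss(x, z, 1.0)                  # + log p(x|z)
             - log_gauss(z, x / 2, 0.5))             # - log q(z|x)
    # log (1/k) sum_i w_i, computed stably via log-sum-exp
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))

rng = np.random.default_rng(0)
# For this toy model q(z|x) happens to be the exact posterior, so the
# weights are constant and the bound is exact; the true marginal is N(0, 2).
true_logp = log_gauss(1.0, 0.0, 2.0)
print(iw_bound(1.0, 5, rng))
```

With a mismatched proposal the bound is strictly below log p(x) and tightens as k grows, which is the property the IWAE objective exploits during training.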

    10/18/18 | In toto imaging and reconstruction of post-implantation mouse development at the single-cell level.
    McDole K, Guignard L, Amat F, Berger A, Malandain G, Royer LA, Turaga SC, Branson K, Keller PJ
    Cell. 2018 Oct 10;175(3):859-876. doi: 10.1016/j.cell.2018.09.031

    The mouse embryo has long been central to the study of mammalian development; however, elucidating the cell behaviors governing gastrulation and the formation of tissues and organs remains a fundamental challenge. A major obstacle is the lack of live imaging and image analysis technologies capable of systematically following cellular dynamics across the developing embryo. We developed a light-sheet microscope that adapts itself to the dramatic changes in size, shape, and optical properties of the post-implantation mouse embryo and captures its development from gastrulation to early organogenesis at the cellular level. We furthermore developed a computational framework for reconstructing long-term cell tracks, cell divisions, dynamic fate maps, and maps of tissue morphogenesis across the entire embryo. By jointly analyzing cellular dynamics in multiple embryos registered in space and time, we built a dynamic atlas of post-implantation mouse development that, together with our microscopy and computational methods, is provided as a resource.

    12/01/12 | JAABA: interactive machine learning for automatic annotation of animal behavior.
    Kabra M, Robie AA, Rivera-Alba M, Branson S, Branson K
    Nature Methods. 2012 Dec;10:64-7

    We present a machine learning–based system for automatically computing interpretable, quantitative measures of animal behavior. Through our interactive system, users encode their intuition about behavior by annotating a small set of video frames. These manual labels are converted into classifiers that can automatically annotate behaviors in screen-scale data sets. Our general-purpose system can create a variety of accurate individual and social behavior classifiers for different organisms, including mice and adult and larval Drosophila.
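The workflow the abstract describes — a user labels a small set of frames, a classifier is trained on them, and the classifier then annotates all remaining frames — can be sketched with a deliberately simple stand-in classifier (JAABA's actual classifier is a boosted model over windowed trajectory features; the nearest-centroid learner and the single speed feature below are invented for illustration):

```python
import numpy as np

def train_centroid_classifier(features, labels):
    """Fit a nearest-centroid classifier to the handful of frames the
    user labeled (a toy stand-in for JAABA's boosted classifier)."""
    labels = np.asarray(labels)
    return {c: features[labels == c].mean(axis=0) for c in sorted(set(labels.tolist()))}

def annotate(centroids, features):
    """Automatically label every frame with the class of the nearest centroid."""
    return [min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))
            for f in features]

# One per-frame feature (e.g. speed); the user labels four frames as
# "stopped" (0) or "walking" (1), then the rest are annotated automatically.
feats = np.array([[0.1], [0.2], [5.0], [4.5]])
model = train_centroid_classifier(feats, [0, 0, 1, 1])
print(annotate(model, np.array([[0.15], [4.8], [3.0]])))  # [0, 1, 1]
```

The point of the interactive loop is that the user inspects such automatic annotations, corrects errors with a few more labels, and retrains until the classifier matches their intuition.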

    10/31/16 | Learning a metric for class-conditional KNN.
    Im DJ, Taylor GW
    International Joint Conference on Neural Networks, IJCNN 2016. 2016 Oct 31:. doi: 10.1109/IJCNN.2016.7727436

    Naïve Bayes Nearest Neighbour (NBNN) is a simple and effective framework which addresses many of the pitfalls of K-Nearest Neighbour (KNN) classification. It has yielded competitive results on several computer vision benchmarks. Its central tenet is that during NN search, a query is not compared to every example in a database, ignoring class information. Instead, NN searches are performed within each class, generating a score per class. A key problem with NN techniques, including NBNN, is that they fail when the data representation does not capture perceptual (e.g. class-based) similarity. NBNN circumvents this by using independent engineered descriptors (e.g. SIFT). To extend its applicability outside of image-based domains, we propose to learn a metric which captures perceptual similarity. Similar to how Neighbourhood Components Analysis optimizes a differentiable form of KNN classification, we propose 'Class Conditional' metric learning (CCML), which optimizes a soft form of the NBNN selection rule. Typical metric learning algorithms learn either a global or local metric. However, our proposed method can be adjusted to a particular level of locality by tuning a single parameter. An empirical evaluation on classification and retrieval tasks demonstrates that our proposed method clearly outperforms existing learned distance metrics across a variety of image and non-image datasets.
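The NBNN selection rule the paper softens works as follows: each class is scored by summing, over the query's descriptors, the distance to that class's nearest descriptor, and the lowest-scoring class wins. A small NumPy sketch of the hard rule in Euclidean distance (the two-class database below is invented; CCML itself optimizes a softened version of this rule under a learned metric rather than the identity metric used here):

```python
import numpy as np

def nbnn_classify(query_descriptors, class_db):
    """Naive Bayes Nearest Neighbour: score each class by summing, over
    the query's descriptors, the squared distance to that class's
    nearest stored descriptor; return the lowest-scoring class."""
    scores = {}
    for c, db in class_db.items():
        # pairwise squared distances, shape (n_query, n_db)
        d2 = ((query_descriptors[:, None, :] - db[None, :, :]) ** 2).sum(-1)
        scores[c] = d2.min(axis=1).sum()   # nearest neighbour within class c
    return min(scores, key=scores.get)

db = {
    "fly":   np.array([[0.0, 0.0], [0.2, 0.1]]),
    "mouse": np.array([[5.0, 5.0], [4.8, 5.2]]),
}
print(nbnn_classify(np.array([[0.1, 0.0], [0.15, 0.05]]), db))  # fly
```

Note that the per-class minimum is taken independently for each query descriptor, which is exactly the step a single global KNN search over the pooled database would miss.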

    11/01/12 | Learning animal social behavior from trajectory features.
    Eyjolfsdottir E, Burgos-Artizzu XP, Branson S, Branson K, Anderson D, Perona P
    Workshop on Visual Observation and Analysis of Animal and Insect Behavior. 2012 Nov:

    11/01/16 | Learning recurrent representations for hierarchical behavior modeling.
    Eyjolfsdottir E, Branson K, Yue Y, Perona P
    arXiv. 2016 Nov 1;arXiv:1611.00094(arXiv:1611.00094):

    We propose a framework for detecting action patterns from motion sequences and modeling the sensory-motor relationship of animals, using a generative recurrent neural network. The network has a discriminative part (classifying actions) and a generative part (predicting motion), whose recurrent cells are laterally connected, allowing higher levels of the network to represent high level phenomena. We test our framework on two types of data, fruit fly behavior and online handwriting. Our results show that 1) taking advantage of unlabeled sequences, by predicting future motion, significantly improves action detection performance when training labels are scarce, 2) the network learns to represent high level phenomena such as writer identity and fly gender, without supervision, and 3) simulated motion trajectories, generated by treating motion prediction as input to the network, look realistic and may be used to qualitatively evaluate whether the model has learnt generative control rules.
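The two-headed design described above — a shared recurrent state feeding both a discriminative head (action classification) and a generative head (motion prediction) — can be illustrated very loosely with a single recurrent cell in NumPy (all sizes and weights here are made up; the paper's actual architecture is a multi-level network with laterally connected recurrent cells):

```python
import numpy as np

rng = np.random.default_rng(0)
H, M, A = 8, 2, 3   # toy hidden size, motion dims, number of action classes
Wh = rng.normal(size=(H, H)) * 0.1
Wx = rng.normal(size=(H, M)) * 0.1
Wa = rng.normal(size=(A, H)) * 0.1   # discriminative head: action logits
Wm = rng.normal(size=(M, H)) * 0.1   # generative head: next-motion prediction

def step(h, x):
    """One recurrent step: update the hidden state from the current motion
    x, then emit both an action distribution and a predicted next motion."""
    h = np.tanh(Wh @ h + Wx @ x)
    logits = Wa @ h
    action_probs = np.exp(logits) / np.exp(logits).sum()   # softmax
    next_motion = Wm @ h
    return h, action_probs, next_motion

h = np.zeros(H)
for x in np.array([[1.0, 0.0], [0.9, 0.1], [0.8, 0.0]]):   # short motion sequence
    h, p, x_hat = step(h, x)
```

The motion-prediction head is what lets unlabeled sequences contribute a training signal, and feeding `x_hat` back in as the next input is how the model simulates trajectories.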

    01/01/17 | Machine vision methods for analyzing social interactions.
    Robie AA, Seagraves KM, Egnor SE, Branson K
    The Journal of Experimental Biology. 2017 Jan 01;220(Pt 1):25-34. doi: 10.1242/jeb.142281

    Recent developments in machine vision methods for automatic, quantitative analysis of social behavior have immensely improved both the scale and level of resolution with which we can dissect interactions between members of the same species. In this paper, we review these methods, with a particular focus on how biologists can apply them to their own work. We discuss several components of machine vision-based analyses: methods to record high-quality video for automated analyses, video-based tracking algorithms for estimating the positions of interacting animals, and machine learning methods for recognizing patterns of interactions. These methods are extremely general in their applicability, and we review a subset of successful applications of them to biological questions in several model systems with very different types of social behaviors.

    07/13/17 | Mapping the neural substrates of behavior.
    Robie AA, Hirokawa J, Edwards AW, Umayam LA, Lee A, Phillips ML, Card GM, Korff W, Rubin GM, Simpson JH, Reiser MB, Branson KM
    Cell. 2017-07-13;170(2):393-406. doi: 10.1016/j.cell.2017.06.032

    Assigning behavioral functions to neural structures has long been a central goal in neuroscience and is a necessary first step toward a circuit-level understanding of how the brain generates behavior. Here, we map the neural substrates of locomotion and social behaviors for Drosophila melanogaster using automated machine-vision and machine-learning techniques. From videos of 400,000 flies, we quantified the behavioral effects of activating 2,204 genetically targeted populations of neurons. We combined a novel quantification of anatomy with our behavioral analysis to create brain-behavior correlation maps, which are shared as browsable web pages and interactive software. Based on these maps, we generated hypotheses of regions of the brain causally related to sensory processing, locomotor control, courtship, aggression, and sleep. Our maps directly specify genetic tools to target these regions, which we used to identify a small population of neurons with a role in the control of walking.

    • We developed machine-vision methods to broadly and precisely quantify fly behavior
    • We measured effects of activating 2,204 genetically targeted neuronal populations
    • We created whole-brain maps of neural substrates of locomotor and social behaviors
    • We created resources for exploring our results and enabling further investigation

    Machine-vision analyses of large behavior and neuroanatomy data reveal whole-brain maps of regions associated with numerous complex behaviors.
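The brain-behavior correlation maps described above pair, across many driver lines, each line's anatomical expression in a brain region with that line's measured behavioral effect. A toy NumPy sketch of that per-region correlation (the line count, regions, and numbers are invented; the paper's actual maps are built from a far richer anatomical quantification):

```python
import numpy as np

def brain_behavior_map(expression, behavior_effect):
    """For each brain region, correlate (across driver lines) the lines'
    anatomical expression in that region with their behavioral effect."""
    n_lines, n_regions = expression.shape
    return np.array([
        np.corrcoef(expression[:, r], behavior_effect)[0, 1]
        for r in range(n_regions)
    ])

# 4 hypothetical lines x 3 hypothetical regions
expr = np.array([[1.0, 0.0, 0.2],
                 [0.9, 0.1, 0.8],
                 [0.1, 0.9, 0.5],
                 [0.0, 1.0, 0.1]])
effect = np.array([2.0, 1.9, 0.2, 0.1])   # e.g. change in walking speed
print(np.round(brain_behavior_map(expr, effect), 2))
```

Regions whose expression tracks the behavioral effect across lines score near +1, which is what flags them as candidate substrates for follow-up with more restricted genetic tools.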
