Branson Lab / Publications

48 Publications

Showing 11-20 of 48 results
10/09/19 | Computational neuroethology: A call to action.
Datta SR, Anderson DJ, Branson K, Perona P, Leifer A
Neuron. 2019 Oct 09;104(1):11-24. doi: 10.1016/j.neuron.2019.09.038

The brain is worthy of study because it is in charge of behavior. A flurry of recent technical advances in measuring and quantifying naturalistic behaviors provides an important opportunity for advancing brain science. However, the problem of understanding unrestrained behavior in the context of neural recordings and manipulations remains unsolved, and developing approaches to address this challenge is critical. Here we discuss considerations in computational neuroethology, the science of quantifying naturalistic behaviors for understanding the brain, and propose strategies to evaluate progress. We point to open questions that require resolution and call upon the broader systems neuroscience community to further develop and leverage measures of naturalistic, unrestrained behavior, which will enable us to more effectively probe the richness and complexity of the brain.

08/12/19 | An automatic behavior recognition system classifies animal behaviors using movements and their temporal context.
Ravbar P, Branson K, Simpson JH
Journal of Neuroscience Methods. 2019 Aug 12;326:108352. doi: 10.1016/j.jneumeth.2019.108352

Animals can perform complex and purposeful behaviors by executing simpler movements in flexible sequences. It is particularly challenging to analyze behavior sequences when they are highly variable, as is the case in language production, certain types of birdsong and, as in our experiments, flies grooming. High sequence variability necessitates rigorous quantification of large amounts of data to identify organizational principles and temporal structure of such behavior. To cope with large amounts of data, and minimize human effort and subjective bias, researchers often use automatic behavior recognition software. Our standard grooming assay involves coating flies in dust and videotaping them as they groom to remove it. The flies move freely and so perform the same movements in various orientations. As the dust is removed, their appearance changes. These conditions make it difficult to rely on precise body alignment and anatomical landmarks such as eyes or legs and thus present challenges to existing behavior classification software. Human observers use speed, location, and shape of the movements as the diagnostic features of particular grooming actions. We applied this intuition to design a new automatic behavior recognition system (ABRS) based on spatiotemporal features in the video data, heavily weighted for temporal dynamics and invariant to the animal’s position and orientation in the scene. We use these spatiotemporal features in two steps of supervised classification that reflect two time-scales at which the behavior is structured. As a proof of principle, we show results from quantification and analysis of a large data set of stimulus-induced fly grooming behaviors that would have been difficult to assess in a smaller dataset of human-annotated ethograms. While we developed and validated this approach to analyze fly grooming behavior, we propose that the strategy of combining alignment-invariant features and multi-timescale analysis may be generally useful for movement-based classification of behavior from video data.
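
For readers who want the gist in code, the following is a minimal sketch of the two ideas in this abstract: alignment-invariant spatiotemporal features, followed by supervised classification at two time-scales. The feature set and classifiers are illustrative stand-ins, not the published ABRS implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_spatiotemporal_features(frames, window=5):
    """Per-window motion and appearance statistics; these depend on how the
    pixels change over time, not on where the fly is or how it is oriented."""
    feats = []
    for t in range(len(frames) - window):
        clip = frames[t:t + window].astype(float)
        diff = np.abs(np.diff(clip, axis=0))      # frame-to-frame motion
        feats.append([diff.mean(), diff.std(),    # motion energy
                      clip.mean(), clip.std()])   # appearance statistics
    return np.asarray(feats)

# Toy data: 200 random 32x32 frames with a toy 3-class label per window.
rng = np.random.default_rng(0)
frames = rng.integers(0, 255, size=(200, 32, 32))
X = extract_spatiotemporal_features(frames)
y = rng.integers(0, 3, size=len(X))

# Step 1: classify each short window from its spatiotemporal features.
stage1 = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
proba = stage1.predict_proba(X)

# Step 2: reclassify using a longer temporal context of step-1 outputs,
# mirroring the two time-scales at which the behavior is structured.
context = np.hstack([np.roll(proba, s, axis=0) for s in (-2, -1, 0, 1, 2)])
stage2 = RandomForestClassifier(n_estimators=50, random_state=0).fit(context, y)
```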

Branson Lab / Card Lab
07/01/19 | State-dependent decoupling of sensory and motor circuits underlies behavioral flexibility in Drosophila.
Ache JM, Namiki S, Lee A, Branson K, Card GM
Nature Neuroscience. 2019 Jul 01;22(7):1132-1139. doi: 10.1038/s41593-019-0413-4

An approaching predator and self-motion toward an object can generate similar looming patterns on the retina, but these situations demand different rapid responses. How central circuits flexibly process visual cues to activate appropriate, fast motor pathways remains unclear. Here we identify two descending neuron (DN) types that control landing and contribute to visuomotor flexibility in Drosophila. For each, silencing impairs visually evoked landing, activation drives landing, and spike rate determines leg extension amplitude. Critically, visual responses of both DNs are severely attenuated during non-flight periods, effectively decoupling visual stimuli from the landing motor pathway when landing is inappropriate. The flight-dependence mechanism differs between DN types. Octopamine exposure mimics flight effects in one, whereas the other probably receives neuronal feedback from flight motor circuits. Thus, this sensorimotor flexibility arises from distinct mechanisms for gating action-specific descending pathways, such that sensory and motor networks are coupled or decoupled according to the behavioral state.

12/14/18 | Motor cortex is an input-driven dynamical system controlling dexterous movement.
Sauerbrei B, Guo J, Mischiati M, Guo W, Kabra M, Verma N, Branson KM, Hantman AW
bioRxiv. 2018 Dec 14:266320. doi: 10.1101/266320

Skillful control of movement is central to our ability to sense and manipulate the world. A large body of work in nonhuman primates has demonstrated that motor cortex provides flexible, time-varying activity patterns that control the arm during reaching and grasping. Previous studies have suggested that these patterns are generated by strong local recurrent dynamics operating autonomously from inputs during movement execution. An alternative possibility is that motor cortex requires coordination with upstream brain regions throughout the entire movement in order to yield these patterns. Here, we developed an experimental preparation in the mouse to directly test these possibilities using optogenetics and electrophysiology during a skilled reach-to-grab-to-eat task. To validate this preparation, we first established that a specific, time-varying pattern of motor cortical activity was required to produce coordinated movement. Next, in order to disentangle the contribution of local recurrent motor cortical dynamics from external input, we optogenetically held the recurrent contribution constant, then observed how motor cortical activity recovered following the end of this perturbation. Both the neural responses and hand trajectory varied from trial to trial, and this variability reflected variability in external inputs. To directly probe the role of these inputs, we used optogenetics to perturb activity in the thalamus. Thalamic perturbation at the start of the trial prevented movement initiation, and perturbation at any stage of the movement prevented progression of the hand to the target; this demonstrates that input is required throughout the movement. By comparing motor cortical activity with and without thalamic perturbation, we were able to estimate the effects of external inputs on motor cortical population activity. Thus, unlike pattern-generating circuits that are local and autonomous, such as those in the spinal cord that generate left-right alternation during locomotion, the pattern generator for reaching and grasping is distributed across multiple, strongly-interacting brain regions.
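
The autonomous-versus-input-driven distinction at the heart of this abstract can be made concrete with a toy linear state-space model. This is an illustrative sketch, not the paper's model or analysis; all dimensions and matrices are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, T = 10, 3, 100
A = 0.95 * np.linalg.qr(rng.standard_normal((n, n)))[0]  # stable recurrent dynamics
B = rng.standard_normal((n, m))
u = rng.standard_normal((T, m))          # external (e.g. thalamic) input stream

x_auto = np.zeros((T, n))
x_driven = np.zeros((T, n))
x_auto[0] = x_driven[0] = rng.standard_normal(n)
for t in range(T - 1):
    x_auto[t + 1] = A @ x_auto[t]                   # recurrence alone
    x_driven[t + 1] = A @ x_driven[t] + B @ u[t]    # recurrence plus ongoing input

# If the circuit were autonomous, removing input mid-movement would not matter;
# here the two trajectories diverge as soon as the input term contributes.
print(np.linalg.norm(x_auto - x_driven, axis=1)[:5])
```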

10/18/18 | In toto imaging and reconstruction of post-implantation mouse development at the single-cell level.
McDole K, Guignard L, Amat F, Berger A, Malandain G, Royer LA, Turaga SC, Branson K, Keller PJ
Cell. 2018 Oct 18;175(3):859-876. doi: 10.1016/j.cell.2018.09.031

The mouse embryo has long been central to the study of mammalian development; however, elucidating the cell behaviors governing gastrulation and the formation of tissues and organs remains a fundamental challenge. A major obstacle is the lack of live imaging and image analysis technologies capable of systematically following cellular dynamics across the developing embryo. We developed a light-sheet microscope that adapts itself to the dramatic changes in size, shape, and optical properties of the post-implantation mouse embryo and captures its development from gastrulation to early organogenesis at the cellular level. We furthermore developed a computational framework for reconstructing long-term cell tracks, cell divisions, dynamic fate maps, and maps of tissue morphogenesis across the entire embryo. By jointly analyzing cellular dynamics in multiple embryos registered in space and time, we built a dynamic atlas of post-implantation mouse development that, together with our microscopy and computational methods, is provided as a resource.
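
One small ingredient of such a reconstruction framework, linking cell detections between consecutive frames, can be sketched as follows. The published pipeline is far more sophisticated; the function name and parameters here are illustrative only.

```python
import numpy as np
from scipy.spatial import cKDTree

def link_frames(prev_pts, next_pts, max_dist=5.0):
    """Return (i, j) index pairs linking detections in frame t to frame t+1
    by greedy nearest-neighbor matching within a distance threshold."""
    tree = cKDTree(next_pts)
    dist, idx = tree.query(prev_pts, distance_upper_bound=max_dist)
    return [(i, j) for i, (d, j) in enumerate(zip(dist, idx)) if np.isfinite(d)]

rng = np.random.default_rng(2)
cells_t = rng.uniform(0, 100, size=(50, 3))              # 3-D centroids at time t
cells_t1 = cells_t + rng.normal(0, 0.5, cells_t.shape)   # small motion at t+1
links = link_frames(cells_t, cells_t1)
print(len(links), "of 50 cells linked")
```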

08/20/18 | Multiple animals tracking in video using part affinity fields.
Rodriguez IF, Megret R, Egnor R, Branson K, Agosto JL, Giray T, Acuna E
Visual observation and analysis of Vertebrate And Insect Behavior 2018. 2018 Aug 20.

In this work, we address the problem of pose detection and tracking of multiple individuals for the study of behaviour in insects and animals. Using a Deep Neural Network architecture, precise detection and association of the body parts can be performed. The models are learned from user-annotated training videos, which gives the approach flexibility. This is illustrated on two different animals: honeybees and mice, where very good performance in part recognition and association is observed despite the presence of multiple interacting individuals.
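
The part-affinity-field idea the title refers to comes from human pose estimation: a candidate connection between two detected body parts is scored by integrating a predicted 2-D vector field along the segment joining them. Below is a toy sketch of that scoring step, not the authors' code; the field here is synthetic.

```python
import numpy as np

def paf_score(paf, p1, p2, n_samples=10):
    """Average alignment between the PAF (H x W x 2) and the segment p1->p2."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    v = (p2 - p1) / (np.linalg.norm(p2 - p1) + 1e-8)   # unit limb direction
    score = 0.0
    for t in np.linspace(0.0, 1.0, n_samples):
        x, y = p1 + t * (p2 - p1)                       # sample along the segment
        score += paf[int(round(y)), int(round(x))] @ v  # dot with predicted field
    return score / n_samples

# Toy field pointing along +x everywhere, so horizontal limbs score ~1.
paf = np.zeros((64, 64, 2))
paf[..., 0] = 1.0
print(paf_score(paf, (10, 20), (40, 20)))   # ~1.0: well-aligned candidate
print(paf_score(paf, (20, 10), (20, 40)))   # ~0.0: orthogonal candidate
```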

06/26/18 | Honeybee detection and pose estimation using convolutional neural networks.
Rodriguez IF, Branson KM, Acuna E, Agosto-Rivera J, Giray T, Megret R
RFIAP 2018. 2018 Jun 26.

The ability to automate the analysis of video for monitoring animals and insects is of great interest for behavior science and ecology [1]. In particular, honeybees play a crucial role in agriculture as natural pollinators. However, recent studies have shown that phenomena such as colony collapse disorder are causing the loss of many colonies [2]. Because many interacting factors may explain these events, a multi-faceted analysis of the bees in their environment is required. Our work focuses on developing tools to help model and understand their behavior as individuals, in relation to the health and performance of the colony.

In this paper, we report the development of a new system for the detection, localization and tracking of honeybee body parts from video on the entrance ramp of the colony. The proposed system builds on recent advances in Convolutional Neural Networks (CNNs) for human pose estimation and evaluates their suitability for detecting honeybee pose, as shown in Figure 1. This opens the door to novel animal behavior analysis systems that take advantage of precise detection and tracking of insect pose.
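
As a rough illustration of heatmap-based pose estimation of this kind: the network outputs one confidence map per body part, and each keypoint is decoded as the peak of its map. The sketch below decodes synthetic Gaussian maps standing in for network output; it is not the authors' model.

```python
import numpy as np

def decode_keypoints(heatmaps):
    """heatmaps: (K, H, W) -> (K, 2) array of (x, y) peak locations."""
    K, H, W = heatmaps.shape
    flat = heatmaps.reshape(K, -1).argmax(axis=1)
    return np.stack([flat % W, flat // W], axis=1)

# Synthetic maps with Gaussian peaks at known body-part locations.
H = W = 64
yy, xx = np.mgrid[0:H, 0:W]
parts = [(12, 40), (30, 22), (50, 50)]   # ground-truth (x, y) per part
maps = np.stack([np.exp(-((xx - x)**2 + (yy - y)**2) / 8.0) for x, y in parts])
print(decode_keypoints(maps))            # recovers [[12 40] [30 22] [50 50]]
```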

04/03/18 | A deep (learning) dive into a cell.
Branson K
Nature Methods. 2018 Apr 03;15(4):253-4. doi: 10.1038/nmeth.4658

11/02/17 | Network-size independent covering number bounds for deep networks.
Kabra M, Branson KM
arXiv. 2017 Nov 02: arXiv:1711.00753

We give a covering number bound for deep learning networks that is independent of the size of the network. The key to this simple analysis is that, for linear classifiers, rotating the data does not affect the covering number. We can therefore ignore the rotation part of each layer's linear transformation and obtain the covering number bound by concentrating on the scaling part.
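
The shape of the argument can be sketched as follows; this is a hedged reconstruction from the abstract, not the paper's exact statement.

```latex
% Sketch of the rotation-invariance argument (reconstructed from the abstract).
% Factor each layer's weight matrix by its singular value decomposition:
\[
  W_\ell = U_\ell \, \Sigma_\ell \, V_\ell^{\top},
  \qquad U_\ell,\ V_\ell \ \text{orthogonal}.
\]
% Orthogonal maps are isometries, so rotating the data changes no pairwise
% distances and hence no covering numbers:
\[
  \mathcal{N}\bigl(\epsilon,\ \{x \mapsto W_\ell x\},\ \|\cdot\|\bigr)
  = \mathcal{N}\bigl(\epsilon,\ \{x \mapsto \Sigma_\ell x\},\ \|\cdot\|\bigr),
\]
% leaving a bound that depends only on the scaling factors $\Sigma_\ell$,
% not on the number of units in each layer.
```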

07/13/17 | Mapping the neural substrates of behavior.
Robie AA, Hirokawa J, Edwards AW, Umayam LA, Lee A, Phillips ML, Card GM, Korff W, Rubin GM, Simpson JH, Reiser MB, Branson KM
Cell. 2017 Jul 13;170(2):393-406. doi: 10.1016/j.cell.2017.06.032

Assigning behavioral functions to neural structures has long been a central goal in neuroscience and is a necessary first step toward a circuit-level understanding of how the brain generates behavior. Here, we map the neural substrates of locomotion and social behaviors for Drosophila melanogaster using automated machine-vision and machine-learning techniques. From videos of 400,000 flies, we quantified the behavioral effects of activating 2,204 genetically targeted populations of neurons. We combined a novel quantification of anatomy with our behavioral analysis to create brain-behavior correlation maps, which are shared as browsable web pages and interactive software. Based on these maps, we generated hypotheses of regions of the brain causally related to sensory processing, locomotor control, courtship, aggression, and sleep. Our maps directly specify genetic tools to target these regions, which we used to identify a small population of neurons with a role in the control of walking.

• We developed machine-vision methods to broadly and precisely quantify fly behavior
• We measured effects of activating 2,204 genetically targeted neuronal populations
• We created whole-brain maps of neural substrates of locomotor and social behaviors
• We created resources for exploring our results and enabling further investigation

Machine-vision analyses of large behavior and neuroanatomy data reveal whole-brain maps of regions associated with numerous complex behaviors.
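
At their core, brain-behavior correlation maps of this kind reduce to correlating, across driver lines, a vector of anatomical expression per brain region with a vector of behavioral effects. The sketch below uses random stand-in data; the dimensions and variable names are illustrative, not the paper's actual dataset.

```python
import numpy as np

rng = np.random.default_rng(3)
n_lines, n_regions, n_behaviors = 2204, 100, 14
anatomy = rng.random((n_lines, n_regions))               # expression per region
behavior = rng.standard_normal((n_lines, n_behaviors))   # effect per behavior

# Pearson correlation between every region and every behavior across lines.
A = (anatomy - anatomy.mean(0)) / anatomy.std(0)
B = (behavior - behavior.mean(0)) / behavior.std(0)
corr_map = A.T @ B / n_lines                             # (n_regions, n_behaviors)
print(corr_map.shape)
```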
