38 Janelia Publications

Showing 1-10 of 38 results
Your Criteria:
    11/03/20 | Cell types and neuronal circuitry underlying female aggression in Drosophila.
    Schretter CE, Aso Y, Robie AA, Dreher M, Dolan M, Chen N, Ito M, Yang T, Parekh R, Branson KM, Rubin GM
    eLife. 2020 Nov 03;9:e58942. doi: 10.7554/eLife.58942

    Aggressive social interactions are used to compete for limited resources and are regulated by complex sensory cues and the organism's internal state. While both sexes exhibit aggression, its neuronal underpinnings are understudied in females. Here, we identify a population of sexually dimorphic aIPg neurons in the adult central brain whose optogenetic activation increased, and genetic inactivation reduced, female aggression. Analysis of GAL4 lines identified in an unbiased screen for increased female chasing behavior revealed the involvement of another sexually dimorphic neuron, pC1d, and implicated aIPg and pC1d neurons as core nodes regulating female aggression. Connectomic analysis demonstrated that aIPg neurons and pC1d are interconnected and suggested that aIPg neurons may exert part of their effect by gating the flow of visual information to descending neurons. Our work reveals important regulatory components of the neuronal circuitry that underlies female aggressive social interactions and provides tools for their manipulation.

    View Publication Page
    05/14/20 | Detecting the Starting Frame of Actions in Video
    Kwak IS, Guo J, Hantman A, Branson K, Kriegman D
    2020 IEEE Winter Conference on Applications of Computer Vision (WACV). 2020 May 14. doi: 10.1109/WACV45572.2020.9093405

    In this work, we address the problem of precisely localizing key frames of an action, for example, the precise time that a pitcher releases a baseball, or the precise time that a crowd begins to applaud. Key frame localization is a largely overlooked and important action-recognition problem, for example in the field of neuroscience, in which we would like to understand the neural activity that produces the start of a bout of an action. To address this problem, we introduce a novel structured loss function that properly weights the types of errors that matter in such applications: it more heavily penalizes extra and missed action start detections over small misalignments. Our structured loss is based on the best matching between predicted and labeled action starts. We train recurrent neural networks (RNNs) to minimize differentiable approximations of this loss. To evaluate these methods, we introduce the Mouse Reach Dataset, a large, annotated video dataset of mice performing a sequence of actions. The dataset was collected and labeled by experts for the purpose of neuroscience research. On this dataset, we demonstrate that our method outperforms related approaches and baseline methods using an unstructured loss.
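The matching-based structured loss described above can be sketched in miniature: match each labeled start to its nearest prediction, charge a small cost for misalignment and a large fixed cost for misses and extra detections. This is an illustrative toy, not the authors' implementation; the function name `structured_start_loss` and the `miss_cost`/`max_align` values are assumptions, and the paper uses a differentiable approximation trained with RNNs.

```python
def structured_start_loss(pred_starts, true_starts, miss_cost=10.0, max_align=5):
    """Toy structured loss over action-start frames: matched pairs pay a
    small misalignment cost; unmatched labels and predictions pay a large
    fixed cost (illustrative values, not from the paper)."""
    preds = sorted(pred_starts)
    used = set()
    loss = 0.0
    for t in sorted(true_starts):
        # greedily match each labeled start to the nearest unused prediction
        best, best_d = None, None
        for i, p in enumerate(preds):
            if i in used:
                continue
            d = abs(p - t)
            if best_d is None or d < best_d:
                best, best_d = i, d
        if best is not None and best_d <= max_align:
            used.add(best)
            loss += best_d          # small penalty for misalignment
        else:
            loss += miss_cost       # missed action start
    loss += miss_cost * (len(preds) - len(used))  # extra detections
    return loss
```

Note how the loss favors one slightly late detection (cost 2) over a miss plus a spurious extra (cost 20), which is the weighting the abstract argues matters for neuroscience applications.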

    View Publication Page
    01/16/20 | Cortical pattern generation during dexterous movement is input-driven.
    Sauerbrei BA, Guo J, Cohen JD, Mischiati M, Guo W, Kabra M, Verma N, Mensh B, Branson K, Hantman AW
    Nature. 2020 Jan 16;577(7790):386-91. doi: 10.1038/s41586-019-1869-9

    The motor cortex controls skilled arm movement by sending temporal patterns of activity to lower motor centres. Local cortical dynamics are thought to shape these patterns throughout movement execution. External inputs have been implicated in setting the initial state of the motor cortex, but they may also have a pattern-generating role. Here we dissect the contribution of local dynamics and inputs to cortical pattern generation during a prehension task in mice. Perturbing cortex to an aberrant state prevented movement initiation, but after the perturbation was released, cortex either bypassed the normal initial state and immediately generated the pattern that controls reaching or failed to generate this pattern. The difference in these two outcomes was probably a result of external inputs. We directly investigated the role of inputs by inactivating the thalamus; this perturbed cortical activity and disrupted limb kinematics at any stage of the movement. Activation of thalamocortical axon terminals at different frequencies disrupted cortical activity and arm movement in a graded manner. Simultaneous recordings revealed that both thalamic activity and the current state of cortex predicted changes in cortical activity. Thus, the pattern generator for dexterous arm movement is distributed across multiple, strongly interacting brain regions.

    View Publication Page
    10/09/19 | Computational neuroethology: A call to action.
    Datta SR, Anderson DJ, Branson K, Perona P, Leifer A
    Neuron. 2019 Oct 09;104(1):11-24. doi: 10.1016/j.neuron.2019.09.038

    The brain is worthy of study because it is in charge of behavior. A flurry of recent technical advances in measuring and quantifying naturalistic behaviors provide an important opportunity for advancing brain science. However, the problem of understanding unrestrained behavior in the context of neural recordings and manipulations remains unsolved, and developing approaches to addressing this challenge is critical. Here we discuss considerations in computational neuroethology-the science of quantifying naturalistic behaviors for understanding the brain-and propose strategies to evaluate progress. We point to open questions that require resolution and call upon the broader systems neuroscience community to further develop and leverage measures of naturalistic, unrestrained behavior, which will enable us to more effectively probe the richness and complexity of the brain.

    View Publication Page
    08/12/19 | An automatic behavior recognition system classifies animal behaviors using movements and their temporal context.
    Ravbar P, Branson K, Simpson JH
    Journal of Neuroscience Methods. 2019 Aug 12;326:108352. doi: 10.1016/j.jneumeth.2019.108352

    Animals can perform complex and purposeful behaviors by executing simpler movements in flexible sequences. It is particularly challenging to analyze behavior sequences when they are highly variable, as is the case in language production, certain types of birdsong and, as in our experiments, flies grooming. High sequence variability necessitates rigorous quantification of large amounts of data to identify organizational principles and temporal structure of such behavior. To cope with large amounts of data, and minimize human effort and subjective bias, researchers often use automatic behavior recognition software. Our standard grooming assay involves coating flies in dust and videotaping them as they groom to remove it. The flies move freely and so perform the same movements in various orientations. As the dust is removed, their appearance changes. These conditions make it difficult to rely on precise body alignment and anatomical landmarks such as eyes or legs and thus present challenges to existing behavior classification software. Human observers use speed, location, and shape of the movements as the diagnostic features of particular grooming actions. We applied this intuition to design a new automatic behavior recognition system (ABRS) based on spatiotemporal features in the video data, heavily weighted for temporal dynamics and invariant to the animal’s position and orientation in the scene. We use these spatiotemporal features in two steps of supervised classification that reflect two time-scales at which the behavior is structured. As a proof of principle, we show results from quantification and analysis of a large data set of stimulus-induced fly grooming behaviors that would have been difficult to assess in a smaller dataset of human-annotated ethograms. 
While we developed and validated this approach to analyze fly grooming behavior, we propose that the strategy of combining alignment-invariant features and multi-timescale analysis may be generally useful for movement-based classification of behavior from video data.
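The idea of alignment-invariant spatiotemporal features can be illustrated with a toy version: summarize each frame transition by its total motion energy (a quantity unchanged by the animal's position or orientation), then pool those energies over a short temporal window. This is a minimal sketch of the principle only; the function name `spatiotemporal_features` and the pooling statistics are assumptions, and the actual ABRS uses much richer features and two classification stages.

```python
def spatiotemporal_features(frames, window=3):
    """Toy position/orientation-invariant features: per-transition motion
    energy (summed absolute pixel differences between consecutive frames),
    pooled over a sliding temporal window. Frames are flat pixel lists."""
    # motion energy for each consecutive frame pair
    energy = []
    for a, b in zip(frames, frames[1:]):
        energy.append(sum(abs(x - y) for x, y in zip(a, b)))
    # pool (mean, max, min) over each temporal window to capture dynamics
    feats = []
    for i in range(len(energy) - window + 1):
        w = energy[i:i + window]
        feats.append((sum(w) / window, max(w), min(w)))
    return feats
```

Because the features depend only on pixel differences between frames, translating or rotating the fly in the scene leaves them (approximately) unchanged, which is the invariance the abstract emphasizes.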

    View Publication Page
    07/01/19 | State-dependent decoupling of sensory and motor circuits underlies behavioral flexibility in Drosophila.
    Ache JM, Namiki S, Lee A, Branson K, Card GM
    Nature Neuroscience. 2019 Jul 01;22(7):1132-1139. doi: 10.1038/s41593-019-0413-4

    An approaching predator and self-motion toward an object can generate similar looming patterns on the retina, but these situations demand different rapid responses. How central circuits flexibly process visual cues to activate appropriate, fast motor pathways remains unclear. Here we identify two descending neuron (DN) types that control landing and contribute to visuomotor flexibility in Drosophila. For each, silencing impairs visually evoked landing, activation drives landing, and spike rate determines leg extension amplitude. Critically, visual responses of both DNs are severely attenuated during non-flight periods, effectively decoupling visual stimuli from the landing motor pathway when landing is inappropriate. The flight-dependence mechanism differs between DN types. Octopamine exposure mimics flight effects in one, whereas the other probably receives neuronal feedback from flight motor circuits. Thus, this sensorimotor flexibility arises from distinct mechanisms for gating action-specific descending pathways, such that sensory and motor networks are coupled or decoupled according to the behavioral state.

    View Publication Page
    12/14/18 | Motor cortex is an input-driven dynamical system controlling dexterous movement.
    Sauerbrei B, Guo J, Mischiati M, Guo W, Kabra M, Verma N, Branson KM, Hantman AW
    bioRxiv. 2018 Dec 14:266320. doi: 10.1101/266320

    Skillful control of movement is central to our ability to sense and manipulate the world. A large body of work in nonhuman primates has demonstrated that motor cortex provides flexible, time-varying activity patterns that control the arm during reaching and grasping. Previous studies have suggested that these patterns are generated by strong local recurrent dynamics operating autonomously from inputs during movement execution. An alternative possibility is that motor cortex requires coordination with upstream brain regions throughout the entire movement in order to yield these patterns. Here, we developed an experimental preparation in the mouse to directly test these possibilities using optogenetics and electrophysiology during a skilled reach-to-grab-to-eat task. To validate this preparation, we first established that a specific, time-varying pattern of motor cortical activity was required to produce coordinated movement. Next, in order to disentangle the contribution of local recurrent motor cortical dynamics from external input, we optogenetically held the recurrent contribution constant, then observed how motor cortical activity recovered following the end of this perturbation. Both the neural responses and hand trajectory varied from trial to trial, and this variability reflected variability in external inputs. To directly probe the role of these inputs, we used optogenetics to perturb activity in the thalamus. Thalamic perturbation at the start of the trial prevented movement initiation, and perturbation at any stage of the movement prevented progression of the hand to the target; this demonstrates that input is required throughout the movement. By comparing motor cortical activity with and without thalamic perturbation, we were able to estimate the effects of external inputs on motor cortical population activity. 
Thus, unlike pattern-generating circuits that are local and autonomous, such as those in the spinal cord that generate left-right alternation during locomotion, the pattern generator for reaching and grasping is distributed across multiple, strongly-interacting brain regions.

    View Publication Page
    10/18/18 | In toto imaging and reconstruction of post-implantation mouse development at the single-cell level.
    McDole K, Guignard L, Amat F, Berger A, Malandain G, Royer LA, Turaga SC, Branson K, Keller PJ
    Cell. 2018 Oct 10;175(3):859-876. doi: 10.1016/j.cell.2018.09.031

    The mouse embryo has long been central to the study of mammalian development; however, elucidating the cell behaviors governing gastrulation and the formation of tissues and organs remains a fundamental challenge. A major obstacle is the lack of live imaging and image analysis technologies capable of systematically following cellular dynamics across the developing embryo. We developed a light-sheet microscope that adapts itself to the dramatic changes in size, shape, and optical properties of the post-implantation mouse embryo and captures its development from gastrulation to early organogenesis at the cellular level. We furthermore developed a computational framework for reconstructing long-term cell tracks, cell divisions, dynamic fate maps, and maps of tissue morphogenesis across the entire embryo. By jointly analyzing cellular dynamics in multiple embryos registered in space and time, we built a dynamic atlas of post-implantation mouse development that, together with our microscopy and computational methods, is provided as a resource.

    View Publication Page
    08/20/18 | Multiple animals tracking in video using part affinity fields
    Rodriguez IF, Megret R, Egnor R, Branson K, Agosto JL, Giray T, Acuna E
    Visual observation and analysis of Vertebrate And Insect Behavior 2018. 2018 Aug 20:

    In this work, we address the problem of pose detection and tracking of multiple individuals for the study of behaviour in insects and animals. Using a Deep Neural Network architecture, precise detection and association of the body parts can be performed. The models are learned based on user-annotated training videos, which gives flexibility to the approach. This is illustrated on two different animals: honeybees and mice, where very good performance in part recognition and association are observed despite the presence of multiple interacting individuals.
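Part affinity fields associate detected body parts by scoring each candidate limb against a learned vector field: the score is the average agreement between the field's vectors sampled along the candidate segment and the segment's own direction. A minimal sketch of that scoring step follows; the function name `paf_score`, the grid representation of `field`, and the sample count are all illustrative assumptions, not the paper's implementation.

```python
import math

def paf_score(field, p1, p2, samples=10):
    """Score a candidate limb p1 -> p2 by averaging the dot product between
    the affinity vectors sampled along the segment and the segment's unit
    direction. `field[y][x]` is a (vx, vy) affinity vector on a toy grid."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    norm = math.hypot(dx, dy) or 1.0
    ux, uy = dx / norm, dy / norm  # unit vector of the candidate limb
    total = 0.0
    for k in range(samples):
        t = k / (samples - 1)
        # sample the field at evenly spaced points along the segment
        x = int(round(p1[0] + t * dx))
        y = int(round(p1[1] + t * dy))
        vx, vy = field[y][x]
        total += vx * ux + vy * uy
    return total / samples
```

High scores select part pairs belonging to the same individual, which is how the approach keeps identities separate when multiple interacting animals are in view.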

    View Publication Page
    06/26/18 | Honeybee detection and pose estimation using convolutional neural networks.
    Rodriguez IF, Branson KM, Acuna E, Agosto-Rivera J, Giray T, Megret R
    RFIAP 2018. 2018 Jun 26:

    The ability to automate the analysis of video for monitoring animals and insects is of great interest for behavior science and ecology [1]. In particular, honeybees play a crucial role in agriculture as natural pollinators. However, recent studies have shown that phenomena such as colony collapse disorder are causing the loss of many colonies [2]. Because a high number of interacting factors could explain these events, a multi-faceted analysis of the bees in their environment is required. In our work, we focus on developing tools to help model and understand their behavior as individuals, in relation to the health and performance of the colony.

    In this paper, we report the development of a new system for the detection, localization and tracking of honeybee body parts from video on the entrance ramp of the colony. The proposed system builds on recent advances in Convolutional Neural Networks (CNN) for human pose estimation and evaluates their suitability for the detection of honeybee pose as shown in Figure 1. This opens the door for novel animal behavior analysis systems that take advantage of the precise detection and tracking of the insect pose.

    View Publication Page