
3 Janelia Publications

Showing 1-3 of 3 results
    10/09/19 | Computational neuroethology: A call to action.
    Datta SR, Anderson DJ, Branson K, Perona P, Leifer A
    Neuron. 2019 Oct 09;104(1):11-24. doi: 10.1016/j.neuron.2019.09.038

    The brain is worthy of study because it is in charge of behavior. A flurry of recent technical advances in measuring and quantifying naturalistic behaviors provides an important opportunity for advancing brain science. However, the problem of understanding unrestrained behavior in the context of neural recordings and manipulations remains unsolved, and developing approaches to address this challenge is critical. Here we discuss considerations in computational neuroethology (the science of quantifying naturalistic behaviors for understanding the brain) and propose strategies to evaluate progress. We point to open questions that require resolution and call upon the broader systems neuroscience community to further develop and leverage measures of naturalistic, unrestrained behavior, which will enable us to more effectively probe the richness and complexity of the brain.

    08/12/19 | An automatic behavior recognition system classifies animal behaviors using movements and their temporal context.
    Ravbar P, Branson K, Simpson JH
    Journal of Neuroscience Methods. 2019 Aug 12;326:108352. doi: 10.1016/j.jneumeth.2019.108352

    Animals can perform complex and purposeful behaviors by executing simpler movements in flexible sequences. It is particularly challenging to analyze behavior sequences when they are highly variable, as is the case in language production, certain types of birdsong and, as in our experiments, fly grooming. High sequence variability necessitates rigorous quantification of large amounts of data to identify organizational principles and the temporal structure of such behavior. To cope with large amounts of data, and to minimize human effort and subjective bias, researchers often use automatic behavior recognition software. Our standard grooming assay involves coating flies in dust and videotaping them as they groom to remove it. The flies move freely and so perform the same movements in various orientations. As the dust is removed, their appearance changes. These conditions make it difficult to rely on precise body alignment and anatomical landmarks such as eyes or legs, and thus present challenges to existing behavior classification software. Human observers use the speed, location, and shape of the movements as the diagnostic features of particular grooming actions. We applied this intuition to design a new automatic behavior recognition system (ABRS) based on spatiotemporal features in the video data, heavily weighted for temporal dynamics and invariant to the animal’s position and orientation in the scene. We use these spatiotemporal features in two steps of supervised classification that reflect two time scales at which the behavior is structured. As a proof of principle, we show results from quantification and analysis of a large data set of stimulus-induced fly grooming behaviors that would have been difficult to assess in a smaller dataset of human-annotated ethograms. While we developed and validated this approach to analyze fly grooming behavior, we propose that the strategy of combining alignment-invariant features and multi-timescale analysis may be generally useful for movement-based classification of behavior from video data.

    07/01/19 | State-dependent decoupling of sensory and motor circuits underlies behavioral flexibility in Drosophila.
    Ache JM, Namiki S, Lee A, Branson K, Card GM
    Nature Neuroscience. 2019 Jul 01;22(7):1132-1139. doi: 10.1038/s41593-019-0413-4

    An approaching predator and self-motion toward an object can generate similar looming patterns on the retina, but these situations demand different rapid responses. How central circuits flexibly process visual cues to activate appropriate, fast motor pathways remains unclear. Here we identify two descending neuron (DN) types that control landing and contribute to visuomotor flexibility in Drosophila. For each, silencing impairs visually evoked landing, activation drives landing, and spike rate determines leg extension amplitude. Critically, visual responses of both DNs are severely attenuated during non-flight periods, effectively decoupling visual stimuli from the landing motor pathway when landing is inappropriate. The flight-dependence mechanism differs between DN types. Octopamine exposure mimics flight effects in one, whereas the other probably receives neuronal feedback from flight motor circuits. Thus, this sensorimotor flexibility arises from distinct mechanisms for gating action-specific descending pathways, such that sensory and motor networks are coupled or decoupled according to the behavioral state.
