56 Janelia Publications
Animals are often bombarded with visual information and must prioritize specific visual features based on their current needs. The neuronal circuits that detect and relay visual features have been well studied. Yet much less is known about how an animal adjusts its visual attention as its goals or environmental conditions change. During social behaviors, flies need to focus on nearby flies. Here, we study how the flow of visual information is altered when female Drosophila enter an aggressive state. From the connectome, we identified three state-dependent circuit motifs poised to selectively amplify the response of an aggressive female to fly-sized visual objects: convergence of excitatory inputs from neurons conveying select visual features and internal state; dendritic disinhibition of select visual feature detectors; and a switch that toggles between two visual feature detectors. Using cell-type-specific genetic tools, together with behavioral and neurophysiological analyses, we show that each of these circuit motifs functions during female aggression. We reveal that features of this same switch operate in males during courtship pursuit, suggesting that disparate social behaviors may share circuit mechanisms. Our work provides a compelling example of using the connectome to infer circuit mechanisms that underlie dynamic processing of sensory signals.
Neuroscience research in Drosophila is benefiting from large-scale connectomics efforts using electron microscopy (EM) to reveal all the neurons in a brain and their connections. To exploit this knowledge base, researchers relate a connectome’s structure to neuronal function, often by studying individual neuron cell types. Vast libraries of fly driver lines expressing fluorescent reporter genes in sets of neurons have been created and imaged using confocal light microscopy (LM), enabling the targeting of neurons for experimentation. However, creating a fly line for driving gene expression within a single neuron found in an EM connectome remains a challenge, as it typically requires identifying a pair of driver lines where only the neuron of interest is expressed in both. This task and other emerging scientific workflows require finding similar neurons across large data sets imaged using different modalities. Here, we present NeuronBridge, a web application for easily and rapidly finding putative morphological matches between large data sets of neurons imaged using different modalities. We describe the functionality and construction of the NeuronBridge service, including its user-friendly graphical user interface (GUI), extensible data model, serverless cloud architecture, and massively parallel image search engine. NeuronBridge fills a critical gap in the Drosophila research workflow and is used by hundreds of neuroscience researchers around the world. We offer our software code, open APIs, and processed data sets for integration and reuse, and provide the application as a service at http://neuronbridge.janelia.org.
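The massively parallel search engine mentioned above matches neurons by comparing depth-colored maximum intensity projections. As a purely illustrative toy (not NeuronBridge's actual code; the scoring scheme and tolerance here are invented for demonstration), a depth-aware pixel-overlap score might look like:

```python
import numpy as np

def mask_search_score(query_depth, target_depth, z_tol=2):
    """Toy color-depth-style match score: the fraction of the query's
    foreground pixels whose corresponding target pixel is foreground
    at a similar depth. Depth 0 means background."""
    fg = query_depth > 0
    if not fg.any():
        return 0.0
    close = (target_depth > 0) & (np.abs(query_depth - target_depth) <= z_tol)
    return float(np.count_nonzero(fg & close) / np.count_nonzero(fg))
```

In the real service, a score of this kind would be computed against millions of precomputed projections in parallel; this sketch only conveys the per-pair comparison idea.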
Understanding the cell-type composition and spatial organization of brain regions is crucial for interpreting brain computation and function. In the thalamus, the anterior thalamic nuclei (ATN) are involved in a wide variety of functions, yet the cell-type composition of the ATN remains unmapped at a single-cell and spatial resolution. Combining single-cell RNA sequencing, spatial transcriptomics, and multiplexed fluorescent in situ hybridization, we identify three discrete excitatory cell-type clusters that correspond to the known nuclei of the ATN and uncover marker genes, molecular pathways, and putative functions of these cell types. We further illustrate graded spatial variation along the dorsomedial-ventrolateral axis for all individual nuclei of the ATN and additionally demonstrate that the anteroventral nucleus exhibits spatially covarying protein products and long-range inputs. Collectively, our study reveals discrete and continuous cell-type organizational principles of the ATN, which will help to guide and interpret experiments on ATN computation and function.
How memories are used by the brain to guide future action is poorly understood. In olfactory associative learning in Drosophila, multiple compartments of the mushroom body act in parallel to assign valence to a stimulus. Here, we show that appetitive memories stored in different compartments induce different levels of upwind locomotion. Using a photoactivation screen of a new collection of split-GAL4 drivers and EM connectomics, we identified a cluster of neurons postsynaptic to the mushroom body output neurons (MBONs) that can trigger robust upwind steering. These UpWind Neurons (UpWiNs) integrate inhibitory and excitatory synaptic inputs from MBONs of appetitive and aversive memory compartments, respectively. After training, disinhibition from the appetitive-memory MBONs enhances the response of UpWiNs to reward-predicting odors. Blocking UpWiNs impaired appetitive memory and reduced upwind locomotion during retrieval. Photoactivation of UpWiNs also increased the chance of returning to a location where activation was initiated, suggesting an additional role in olfactory navigation. Thus, our results provide insight into how learned abstract valences are gradually transformed into concrete memory-driven actions through divergent and convergent networks, a neuronal architecture that is commonly found in both vertebrate and invertebrate brains.
The mushroom body (MB) is the center for associative learning in insects. In Drosophila, intersectional split-GAL4 drivers and electron microscopy (EM) connectomes have laid the foundation for precise interrogation of the MB neural circuits. However, many cell types upstream and downstream of the MB remain to be investigated due to the lack of driver lines. Here we describe a new collection of over 800 split-GAL4 and split-LexA drivers that cover approximately 300 cell types, including sugar sensory neurons, putative nociceptive ascending neurons, olfactory and thermo-/hygro-sensory projection neurons, interneurons connected with the MB-extrinsic neurons, and various other cell types. We characterized activation phenotypes for a subset of these lines and identified the sugar sensory neuron line most suitable for reward substitution. Leveraging the thousands of confocal microscopy images associated with the collection, we analyzed neuronal morphological stereotypy and discovered that one set of mushroom body output neurons, MBON08/MBON09, exhibits striking individuality and asymmetry across animals. In conjunction with the EM connectome maps, the driver lines reported here offer a powerful resource for functional dissection of neural circuits for associative learning in adult Drosophila.
Proteins localized at the cellular interface mediate cell-cell communication and thus control many aspects of physiology in multicellular organisms. Cell-surface proteomics allows biologists to comprehensively identify proteins on the cell surface and survey their dynamics in physiological and pathological conditions. PEELing provides an integrated package and user-centric web service for analyzing cell-surface proteomics data. With a streamlined and automated workflow, PEELing evaluates data quality using curated references, performs cutoff analysis to remove contaminants, connects to databases for functional annotation, and generates data visualizations. Together with chemical and transgenic tools, PEELing completes a pipeline making cell-surface proteomics analysis handy for every lab.
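The cutoff analysis PEELing performs can be illustrated with a minimal sketch: score each protein by enrichment over a negative control, then pick the least stringent cutoff that keeps known contaminants (from a curated reference list) below a target fraction. This is a hypothetical simplification for illustration, not PEELing's actual algorithm:

```python
def choose_cutoff(scores, contaminant_refs, max_fpr=0.05):
    """Pick the least stringent enrichment cutoff for which the fraction
    of retained proteins that are known contaminants stays <= max_fpr.

    scores: dict mapping protein -> enrichment score
    contaminant_refs: set of proteins known NOT to be cell-surface
    Returns the chosen cutoff, or None if no cutoff satisfies max_fpr."""
    for cutoff in sorted(set(scores.values())):  # least to most stringent
        kept = [p for p, s in scores.items() if s >= cutoff]
        false_positives = sum(p in contaminant_refs for p in kept)
        if kept and false_positives / len(kept) <= max_fpr:
            return cutoff
    return None
```

The real pipeline also uses a positive reference list of known surface proteins and produces diagnostic plots; this sketch shows only the contaminant-driven cutoff idea.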
Cells contain hundreds of organelles and macromolecular assemblies. Obtaining a complete understanding of their intricate organization requires the nanometre-level, three-dimensional reconstruction of whole cells, which is only feasible with robust and scalable automatic methods. Here, to support the development of such methods, we annotated up to 35 different cellular organelle classes, ranging from endoplasmic reticulum to microtubules to ribosomes, in diverse sample volumes from multiple cell types imaged at a near-isotropic resolution of 4 nm per voxel with focused ion beam scanning electron microscopy (FIB-SEM). We trained deep learning architectures to segment these structures in 4 nm and 8 nm per voxel FIB-SEM volumes, validated their performance and showed that automatic reconstructions can be used to directly quantify previously inaccessible metrics including spatial interactions between cellular components. We also show that such reconstructions can be used to automatically register light and electron microscopy images for correlative studies. We have created an open data and open-source web repository, 'OpenOrganelle', to share the data, computer code and trained models, which will enable scientists everywhere to query and further improve automatic reconstruction of these datasets.
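One such "previously inaccessible metric" is spatial interaction between organelles. A minimal NumPy sketch (illustrative only, not the OpenOrganelle analysis code) counts voxels of one segmentation mask lying within a small neighborhood of another, a crude proxy for a contact interface:

```python
import numpy as np

def contact_voxels(mask_a, mask_b, reach=1):
    """Count voxels of mask_b within `reach` voxels (box/Chebyshev
    neighborhood) of mask_a. Works in any dimension. Caveat: np.roll
    wraps at array edges, which is fine for toy interior objects but
    would need padding for real volumes."""
    dilated = mask_a.astype(bool)
    for axis in range(dilated.ndim):
        acc = dilated.copy()
        for shift in range(1, reach + 1):
            acc |= np.roll(dilated, shift, axis=axis)
            acc |= np.roll(dilated, -shift, axis=axis)
        dilated = acc  # separable per-axis dilation -> box neighborhood
    return int(np.count_nonzero(dilated & mask_b.astype(bool)))
```

On real 4 nm FIB-SEM segmentations this kind of count, scaled by voxel size, approximates contact area between, say, ER and mitochondria masks.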
The neural circuits responsible for animal behavior remain largely unknown. We summarize new methods and present the circuitry of a large fraction of the brain of the fruit fly Drosophila melanogaster. Improved methods include new procedures to prepare, image, align, segment, find synapses in, and proofread such large data sets. We define cell types, refine computational compartments, and provide an exhaustive atlas of cell examples and types, many of them novel. We provide detailed circuits consisting of neurons and their chemical synapses for most of the central brain. We make the data public and simplify access, reducing the effort needed to answer circuit questions, and provide procedures linking the neurons defined by our analysis with genetic reagents. Biologically, we examine distributions of connection strengths, neural motifs on different scales, electrical consequences of compartmentalization, and evidence that maximizing packing density is an important criterion in the evolution of the fly's brain.
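The connectivity statistics mentioned above (connection-strength distributions, small network motifs) can be demonstrated offline on a toy synapse-count matrix; this sketch is purely illustrative and does not reflect the paper's analysis code:

```python
import numpy as np

def reciprocal_pairs(synapse_counts, threshold=1):
    """Count unordered pairs (i, j) connected in both directions with at
    least `threshold` synapses -- the simplest two-neuron motif.
    synapse_counts[i, j] is the synapse count from neuron i to neuron j."""
    strong = synapse_counts >= threshold
    mutual = strong & strong.T           # edges present in both directions
    return int(np.count_nonzero(np.triu(mutual, k=1)))
```

Raising `threshold` probes how motif counts depend on connection strength, one axis of the distributional analyses such a connectome enables.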
Animals avoid predators and find the best food and mates by learning from the consequences of their behavior. However, reinforcers are not always uniquely appetitive or aversive but can have complex properties. Most intoxicating substances fall within this category, provoking aversive sensory and physiological reactions while simultaneously inducing overwhelming appetitive properties. Here we describe the subtle behavioral features associated with continued seeking for alcohol despite aversive consequences. We developed an automated runway apparatus to measure how Drosophila respond to consecutive exposures of a volatilized substance. Behavior within this Behavioral Expression of Ethanol Reinforcement Runway (BEER Run) demonstrated a defined shift from aversive to appetitive responses to volatilized ethanol. Behavioral metrics attained by combining computer vision and machine learning methods reveal that a subset of 9 classified behaviors and component behavioral features associates with this shift. We propose this combination of 9 be
Neurons perform computations by integrating inputs from thousands of synapses, mostly in the dendritic tree, to drive action potential firing in the axon. One fruitful approach to studying this process is to record from neurons using patch-clamp electrodes, fill the recorded neurons with a substance that allows subsequent staining, reconstruct the three-dimensional architectures of the dendrites, and use the resulting functional and structural data to develop computer models of dendritic integration. Accurately producing quantitative reconstructions of dendrites is typically a tedious process taking many hours of manual inspection and measurement. Here we present ShuTu, a new software package that facilitates accurate and efficient reconstruction of dendrites imaged using bright-field microscopy. The program operates in two steps: (1) automated identification of dendritic processes, and (2) manual correction of errors in the automated reconstruction. This approach allows neurons with complex dendritic morphologies to be reconstructed rapidly and efficiently, thus facilitating the use of computer models to study dendritic structure-function relationships and the computations performed by single neurons.