35 Publications
Knowledge of one’s own behavioral state—whether one is walking, grooming, or resting—is critical for contextualizing sensory cues, including interpreting visual motion and tracking odor sources. Additionally, awareness of one’s own posture is important to avoid initiating destabilizing or physically impossible actions. Ascending neurons (ANs), interneurons in the vertebrate spinal cord or insect ventral nerve cord (VNC) that project to the brain, may provide such high-fidelity behavioral state signals. However, little is known about what ANs encode and where they convey signals in any brain. To address this gap, we performed a large-scale functional screen of AN movement encoding, brain targeting, and motor system patterning in the adult fly, Drosophila melanogaster. Using a new library of AN sparse driver lines, we measured the functional properties of 247 genetically identifiable ANs by performing two-photon microscopy recordings of neural activity in tethered, behaving flies. Quantitative, deep network-based neural and behavioral analyses revealed that ANs nearly exclusively encode high-level behaviors—primarily walking as well as resting and grooming—rather than low-level joint or limb movements. ANs that convey self-motion—resting, walking, and responses to gust-like puff stimuli—project to the brain’s anterior ventrolateral protocerebrum (AVLP), a multimodal, integrative sensory hub, while those that encode discrete actions—eye grooming, turning, and proboscis extension—project to the brain’s gnathal ganglion (GNG), a locus for action selection. The structure and polarity of AN projections within the VNC are predictive of their functional encoding and imply that ANs participate in motor computations while also relaying state signals to the brain. Illustrative of this are ANs that temporally integrate proboscis extensions over tens of seconds, likely through recurrent interconnectivity. Thus, in line with long-held theoretical predictions, ascending populations convey high-level behavioral state signals almost exclusively to brain regions implicated in sensory feature contextualization and action selection.
Precise, repeatable genetic access to specific neurons via GAL4/UAS and related methods is a key advantage of Drosophila neuroscience. Neuronal targeting is typically documented using light microscopy of full GAL4 expression patterns, which generally lack the single-cell resolution required for reliable cell type identification. Here we use stochastic GAL4 labeling with the MultiColor FlpOut approach to generate cellular resolution confocal images at large scale. We are releasing aligned images of 74,000 such adult central nervous systems. An anticipated use of this resource is to bridge the gap between neurons identified by electron or light microscopy. Identifying individual neurons that make up each GAL4 expression pattern improves the prediction of split-GAL4 combinations targeting particular neurons. To this end we have made the images searchable on the NeuronBridge website. We demonstrate the potential of NeuronBridge to rapidly and effectively identify neuron matches based on morphology across imaging modalities and datasets.
Dopaminergic neurons with distinct projection patterns and physiological properties compose memory subsystems in the brain. However, it is poorly understood whether or how they interact during complex learning. Here, we identify a feedforward circuit formed between dopamine subsystems and show that it is essential for second-order conditioning, an ethologically important form of higher-order associative learning. The Drosophila mushroom body comprises a series of dopaminergic compartments, each of which exhibits distinct memory dynamics. We find that a slow and stable memory compartment can serve as an effective “teacher” by instructing other faster and transient memory compartments via a single key interneuron, which we identify by connectome analysis and neurotransmitter prediction. This excitatory interneuron acquires enhanced response to reward-predicting odor after first-order conditioning and, upon activation, evokes dopamine release in the “student” compartments. These hierarchical connections between dopamine subsystems explain distinct properties of first- and second-order memory long known to behavioral psychologists.
Many animals rely on vision to navigate through their environment. The pattern of changes in the visual scene induced by self-motion is the optic flow, which is first estimated in local patches by directionally selective (DS) neurons. But how should the arrays of DS neurons, each responsive to motion in a preferred direction at a specific retinal position, be organized to support robust decoding of optic flow by downstream circuits? Understanding this global organization is challenging because it requires mapping fine, local features of neurons across the animal’s field of view. In Drosophila, the asymmetric dendrites of the T4 and T5 DS neurons establish their preferred direction, making it possible to predict DS responses from anatomy. Here we report that the preferred directions of fly DS neurons vary at different retinal positions and show that this spatial variation is established by the anatomy of the compound eye. To estimate the preferred directions across the visual field, we reconstructed hundreds of T4 neurons in a full brain EM volume and discovered unexpectedly stereotypical dendritic arborizations that are independent of location. We then used whole-head μCT scans to map the viewing directions of all compound eye facets and found a non-uniform sampling of visual space that explains the spatial variation in preferred directions. Our findings show that the organization of preferred directions in the fly is largely determined by the compound eye, exposing an intimate and unexpected connection between the peripheral structure of the eye, functional properties of neurons deep in the brain, and the control of body movements.
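As a geometric illustration of the logic in this abstract (a sketch under assumed inputs, not the paper's analysis pipeline): if T4 dendrites are stereotyped on the ommatidial lattice, then the preferred direction at a retinal location in visual space follows from the direction, on the visual sphere, from one facet's viewing direction to that of its relevant lattice neighbor. The viewing-direction vectors below are hypothetical.

```python
# Sketch: map a lattice axis into visual space from facet viewing directions.
# Inputs are hypothetical unit viewing-direction vectors (e.g. of the kind a
# uCT-based eye map would provide); this is an illustration only.
import numpy as np

def local_direction(v_center, v_neighbor):
    """Unit tangent vector at v_center (on the visual sphere) pointing
    toward v_neighbor; both arguments are 3D viewing-direction vectors."""
    v_center = v_center / np.linalg.norm(v_center)
    v_neighbor = v_neighbor / np.linalg.norm(v_neighbor)
    # Project the neighbor's direction into the tangent plane at v_center.
    tangent = v_neighbor - np.dot(v_neighbor, v_center) * v_center
    return tangent / np.linalg.norm(tangent)

# Hypothetical viewing directions of a facet and one lattice neighbor:
v_a = np.array([0.00, 0.00, 1.00])
v_b = np.array([0.10, 0.02, 0.99])
print(local_direction(v_a, v_b))  # local preferred-direction estimate
```

A non-uniform eye map makes this inter-facet direction, and hence the estimated preferred direction, vary across retinal positions even though the dendritic anatomy itself is stereotyped.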
How memories are used by the brain to guide future action is poorly understood. In olfactory associative learning in Drosophila, multiple compartments of the mushroom body act in parallel to assign valence to a stimulus. Here, we show that appetitive memories stored in different compartments induce different levels of upwind locomotion. Using a photoactivation screen of a new collection of split-GAL4 drivers and EM connectomics, we identified a cluster of neurons postsynaptic to the mushroom body output neurons (MBONs) that can trigger robust upwind steering. These UpWind Neurons (UpWiNs) integrate inhibitory and excitatory synaptic inputs from MBONs of appetitive and aversive memory compartments, respectively. After training, disinhibition from the appetitive-memory MBONs enhances the response of UpWiNs to reward-predicting odors. Blocking UpWiNs impaired appetitive memory and reduced upwind locomotion during retrieval. Photoactivation of UpWiNs also increased the chance of returning to a location where activation was initiated, suggesting an additional role in olfactory navigation. Thus, our results provide insight into how learned abstract valences are gradually transformed into concrete memory-driven actions through divergent and convergent networks, a neuronal architecture that is commonly found in the vertebrate and invertebrate brains.
Animals flexibly switch between different actions by changing neural activity patterns for motor control. Courting Drosophila melanogaster males produce two different acoustic signals, pulse and sine song, each of which can be promoted by artificial activation of distinct neurons. However, how the activity of these neurons implements flexible song production is unknown. Here, we developed an assay to record neuronal calcium signals in the ventral nerve cord, which contains the song motor circuit, in singing flies. We found that sine-promoting neurons, but not pulse-promoting neurons, show strong activation during sine song. In contrast, both pulse- and sine-promoting neurons are active during pulse song. Furthermore, population calcium imaging in the song circuit suggests that sine song involves activation of a subset of neurons that are also active during pulse song. Thus, differential activation of overlapping, rather than distinct, neural populations underlies flexible motor actions during acoustic communication in D. melanogaster.
Electron microscopy (EM) allows for the reconstruction of dense neuronal connectomes but suffers from low throughput, limiting its application to small numbers of reference specimens. We developed a protocol and analysis pipeline using tissue expansion and lattice light-sheet microscopy (ExLLSM) to rapidly reconstruct selected circuits across many samples with single synapse resolution and molecular contrast. We validate this approach in Drosophila, demonstrating that it yields synaptic counts similar to those obtained by EM, can be used to compare counts across sex and experience, and to correlate structural connectivity with functional connectivity. This approach fills a critical methodological gap in studying variability in the structure and function of neural circuits across individuals within and between species.
Neuroscience research in Drosophila is benefiting from large-scale connectomics efforts using electron microscopy (EM) to reveal all the neurons in a brain and their connections. In order to exploit this knowledge base, researchers target individual neurons and study their function. Therefore, vast libraries of fly driver lines expressing fluorescent reporter genes in sets of neurons have been created and imaged using confocal light microscopy (LM). However, creating a fly line for driving gene expression within a single neuron found in the EM connectome remains a challenge, as it typically requires identifying a pair of fly lines where only the neuron of interest is expressed in both. This task and other emerging scientific workflows require finding similar neurons across large data sets imaged using different modalities. Here, we present NeuronBridge, a web application for easily and rapidly finding putative morphological matches between large datasets of neurons imaged using different modalities. We describe the functionality and construction of the NeuronBridge service, including its user-friendly GUI, data model, serverless cloud architecture, and massively parallel image search engine. NeuronBridge is openly accessible at http://neuronbridge.janelia.org/.
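The abstract describes the NeuronBridge image search engine only at a high level. As a rough illustration of the color-depth MIP style of 2D comparison often used for this kind of cross-modality matching, here is a toy scoring function; the file names, thresholds, and scoring rule are hypothetical simplifications, not the NeuronBridge implementation.

```python
# Toy color-depth MIP match score: two registered 2D projections in which
# hue encodes depth are compared pixel-by-pixel. Assumes both images are
# aligned to the same template and have identical dimensions.
import numpy as np
from PIL import Image

def load_cdm(path):
    """Load a color-depth MIP image as an HSV array (hue encodes depth)."""
    rgb = Image.open(path).convert("RGB")
    return np.asarray(rgb.convert("HSV"), dtype=np.float32)

def cdm_match_score(query_hsv, target_hsv, intensity_thresh=40, hue_tol=20):
    """Fraction of bright query pixels whose hue (depth) agrees with a
    bright pixel at the same location in the target image."""
    query_on = query_hsv[..., 2] > intensity_thresh
    target_on = target_hsv[..., 2] > intensity_thresh
    if not query_on.any():
        return 0.0
    hue_diff = np.abs(query_hsv[..., 0] - target_hsv[..., 0])
    hue_diff = np.minimum(hue_diff, 256 - hue_diff)   # hue wraps around
    matching = query_on & target_on & (hue_diff < hue_tol)
    return float(matching.sum()) / float(query_on.sum())

# Hypothetical usage: score an EM-derived neuron against an LM line image.
# score = cdm_match_score(load_cdm("em_neuron_cdm.png"), load_cdm("lm_line_cdm.png"))
```

Ranking a library of light-microscopy images by such a score against an EM-derived query is the basic idea behind finding putative morphological matches across modalities; the production service adds masking, mirroring, gradient-based penalties, and massive parallelization.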
Similar to many insect species, Drosophila melanogaster is capable of maintaining a stable flight trajectory for periods lasting up to several hours. Because aerodynamic torque is roughly proportional to the fifth power of wing length, even small asymmetries in wing size require the maintenance of subtle bilateral differences in flapping motion to maintain a stable path. Flies can even fly straight after losing half of a wing, a feat they accomplish via very large, sustained kinematic changes to both the damaged and intact wings. Thus, the neural network responsible for stable flight must be capable of sustaining fine-scaled control over wing motion across a large dynamic range. In this study, we describe an unusual type of descending neuron (DNg02) that projects directly from visual output regions of the brain to the dorsal flight neuropil of the ventral nerve cord. Unlike many descending neurons, which exist as single bilateral pairs with unique morphology, there is a population of at least 15 DNg02 cell pairs with nearly identical shape. By optogenetically activating different numbers of DNg02 cells, we demonstrate that these neurons regulate wingbeat amplitude over a wide dynamic range via a population code. Using two-photon functional imaging, we show that DNg02 cells are responsive to visual motion during flight in a manner that would make them well suited to continuously regulate bilateral changes in wing kinematics. Collectively, we have identified a critical set of descending neurons that provides the sensitivity and dynamic range required for flight control.
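To make the stated scaling concrete, a brief worked example (not from the paper) using the quoted fifth-power dependence of aerodynamic torque on wing length:

```latex
% Torque ratio for a 5% wing-length asymmetry, assuming \tau \propto L^5:
\[
  \frac{\tau_\mathrm{short}}{\tau_\mathrm{long}}
  \approx \left(\frac{L_\mathrm{short}}{L_\mathrm{long}}\right)^{5}
  = (0.95)^{5} \approx 0.77 .
\]
```

So even a 5% mismatch in wing length would, left uncompensated, leave roughly a 23% torque imbalance between the two wings, which the flapping kinematics must continuously offset.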
Color and polarization provide complementary information about the world and are detected by specialized photoreceptors. However, the downstream neural circuits that process these distinct modalities are incompletely understood in any animal. Using electron microscopy, we have systematically reconstructed the synaptic targets of the photoreceptors specialized to detect color and skylight polarization in Drosophila, and we have used light microscopy to confirm many of our findings. We identified known and novel downstream targets that are selective for different wavelengths or polarized light, and followed their projections to other areas in the optic lobes and the central brain. Our results revealed many synapses along the photoreceptor axons between brain regions, new pathways in the optic lobes, and spatially segregated projections to central brain regions. Strikingly, photoreceptors in the polarization-sensitive dorsal rim area target fewer cell types, and lack strong connections to the lobula, a neuropil involved in color processing. Our reconstruction identifies shared wiring and modality-specific specializations for color and polarization vision, and provides a comprehensive view of the first steps of the pathways processing color and polarized light inputs.