14 Janelia Publications
Multi-modal image registration is a challenging task that is vital to fuse complementary signals for subsequent analyses. Despite much research into cost functions addressing this challenge, there exist cases in which these are ineffective. In this work, we show that (1) this is true for the registration of in vivo Drosophila brain volumes visualizing genetically encoded calcium indicators to an nc82 atlas and (2) that machine-learning-based contrast synthesis can yield improvements. More specifically, the number of subjects for which the registration outright failed was greatly reduced (from 40% to 15%) by using a synthesized image.
To survive, animals must convert sensory information into appropriate behaviours. Vision is a common sense for locating ethologically relevant stimuli and guiding motor responses. How circuitry converts object location in retinal coordinates to movement direction in body coordinates remains largely unknown. Here we show through behaviour, physiology, anatomy and connectomics in Drosophila that visuomotor transformation occurs by conversion of topographic maps formed by the dendrites of feature-detecting visual projection neurons (VPNs) into synaptic weight gradients of VPN outputs onto central brain neurons. We demonstrate how this gradient motif transforms the anteroposterior location of a visual looming stimulus into the fly's directional escape. Specifically, we discover that two neurons postsynaptic to a looming-responsive VPN type promote opposite takeoff directions. Opposite synaptic weight gradients onto these neurons from looming VPNs in different visual field regions convert localized looming threats into correctly oriented escapes. For a second looming-responsive VPN type, we demonstrate graded responses along the dorsoventral axis. We show that this synaptic gradient motif generalizes across all 20 primary VPN cell types and most often arises without VPN axon topography. Synaptic gradients may thus be a general mechanism for conveying spatial features of sensory information into directed motor outputs.
Color and polarization provide complementary information about the world and are detected by specialized photoreceptors. However, the downstream neural circuits that process these distinct modalities are incompletely understood in any animal. Using electron microscopy, we have systematically reconstructed the synaptic targets of the photoreceptors specialized to detect color and skylight polarization in Drosophila, and we have used light microscopy to confirm many of our findings. We identified known and novel downstream targets that are selective for different wavelengths or polarized light, and followed their projections to other areas in the optic lobes and the central brain. Our results revealed many synapses along the photoreceptor axons between brain regions, new pathways in the optic lobes, and spatially segregated projections to central brain regions. Strikingly, photoreceptors in the polarization-sensitive dorsal rim area target fewer cell types, and lack strong connections to the lobula, a neuropil involved in color processing. Our reconstruction identifies shared wiring and modality-specific specializations for color and polarization vision, and provides a comprehensive view of the first steps of the pathways processing color and polarized light inputs.
The perception of visual motion is critical for animal navigation, and flies are a prominent model system for exploring this neural computation. In Drosophila, the T4 cells of the medulla are directionally selective and necessary for ON motion behavioral responses. To examine the emergence of directional selectivity, we developed genetic driver lines for the neuron types with the most synapses onto T4 cells. Using calcium imaging, we found that these neuron types are not directionally selective and that selectivity arises in the T4 dendrites. By silencing each input neuron type, we identified which neurons are necessary for T4 directional selectivity and ON motion behavioral responses. We then determined the sign of the connections between these neurons and T4 cells using neuronal photoactivation. Our results indicate a computational architecture for motion detection that is a hybrid of classic theoretical models.