Janelia Publications
In vivo imaging at high spatiotemporal resolution is key to the understanding of complex biological systems. We integrated an optical phase-locked ultrasound lens into a two-photon fluorescence microscope and achieved microsecond-scale axial scanning, thus enabling volumetric imaging at tens of hertz. We applied this system to multicolor volumetric imaging of processes sensitive to motion artifacts, including calcium dynamics in behaving mouse brain and transient morphology changes and trafficking of immune cells.
Mechano-transduction is an emerging but still poorly understood component of T cell activation. Here we investigated the ligand-dependent contribution made by contractile actomyosin arcs populating the peripheral supramolecular activation cluster (pSMAC) region of the immunological synapse (IS) to T cell receptor (TCR) microcluster transport and proximal signaling in primary mouse T cells. Using super resolution microscopy, OT1-CD8+ mouse T cells, and two ovalbumin (OVA) peptides with different affinities for the TCR, we show that the generation of organized actomyosin arcs depends on ligand potency and the ability of myosin 2 to contract actin filaments. While weak ligands induce disorganized actomyosin arcs, strong ligands result in organized actomyosin arcs that correlate well with tension-sensitive CasL phosphorylation and the accumulation of ligands at the IS center. Blocking myosin 2 contractility greatly reduces the difference in the extent of Src and LAT phosphorylation observed between the strong and the weak ligand, arguing that myosin 2-dependent force generation within actin arcs contributes to ligand discrimination. Together, our data are consistent with the idea that actomyosin arcs in the pSMAC region of the IS promote a mechano-chemical feedback mechanism that amplifies the accumulation of critical signaling molecules at the IS.
Motion detection is a fundamental neural computation performed by many sensory systems. In the fly, local motion computation is thought to occur within the first two layers of the visual system, the lamina and medulla. We constructed specific genetic driver lines for each of the 12 neuron classes in the lamina. We then depolarized and hyperpolarized each neuron type and quantified fly behavioral responses to a diverse set of motion stimuli. We found that only a small number of lamina output neurons are essential for motion detection, while most neurons serve to sculpt and enhance these feedforward pathways. Two classes of feedback neurons (C2 and C3), and lamina output neurons (L2 and L4), are required for normal detection of directional motion stimuli. Our results reveal a prominent role for feedback and lateral interactions in motion processing and demonstrate that motion-dependent behaviors rely on contributions from nearly all lamina neuron classes.
The Drosophila mushroom body (MB) is an associative learning network that is important for the control of sleep. We have recently identified particular intrinsic MB Kenyon cell (KC) classes that regulate sleep through synaptic activation of particular MB output neurons (MBONs) whose axons convey sleep control signals out of the MB to downstream target regions. Specifically, we found that sleep-promoting KCs increase sleep by preferentially activating cholinergic sleep-promoting MBONs, while wake-promoting KCs decrease sleep by preferentially activating glutamatergic wake-promoting MBONs. Here we use a combination of genetic and physiological approaches to identify wake-promoting dopaminergic neurons (DANs) that innervate the MB, and show that they activate wake-promoting MBONs. These studies reveal a dopaminergic sleep control mechanism that likely operates by modulation of KC-MBON microcircuits.
A consortium of inhibitory neurons controls the firing patterns of pyramidal cells, but their specific roles in the behaving animal are largely unknown. We performed simultaneous physiological recordings and optogenetic silencing of either perisomatic (parvalbumin (PV) expressing) or dendrite-targeting (somatostatin (SOM) expressing) interneurons in hippocampal area CA1 of head-fixed mice actively moving a treadmill belt rich with visual-tactile stimuli. Silencing of either PV or SOM interneurons increased the firing rates of pyramidal cells selectively in their place fields, with PV and SOM interneurons having their largest effect during the rising and decaying parts of the place field, respectively. SOM interneuron silencing powerfully increased burst firing without altering the theta phase of spikes. In contrast, PV interneuron silencing had no effect on burst firing, but instead shifted the spikes' theta phase toward the trough of theta. These findings indicate that perisomatic and dendritic inhibition have distinct roles in controlling the rate, burst and timing of hippocampal pyramidal cells.
We describe the anatomy of all the primary motor neurons in the fly proboscis and characterize their contributions to its diverse reaching movements. Pairing this behavior with the wealth of genetic tools offers the possibility to study motor control at single-neuron resolution, and soon throughout entire circuits. As an entry to these circuits, we provide detailed anatomy of proboscis motor neurons, muscles, and joints. We create a collection of fly strains to individually manipulate every proboscis muscle through control of its motor neurons, the first such collection for an appendage. We generate a model of the action of each proboscis joint, and find that only a small number of motor neurons are needed to produce proboscis reaching. Comprehensive control of each motor element in this numerically simple system paves the way for future study of both reflexive and flexible movements of this appendage.
We reconstructed, from a whole CNS EM volume, the synaptic map of input and output neurons that underlie food intake behavior of larvae. Input neurons originate from enteric, pharyngeal and external sensory organs and converge onto seven distinct sensory synaptic compartments within the CNS. Output neurons consist of feeding motor, serotonergic modulatory and neuroendocrine neurons. Monosynaptic connections from a set of sensory synaptic compartments cover the motor, modulatory and neuroendocrine targets in overlapping domains. Polysynaptic routes are superimposed on top of monosynaptic connections, resulting in divergent sensory paths that converge on common outputs. A completely different set of sensory compartments is connected to the mushroom body calyx. The mushroom body output neurons are connected to interneurons that directly target the feeding output neurons. Our results illustrate a circuit architecture in which monosynaptic and multisynaptic connections from sensory inputs traverse onto output neurons via a series of converging paths.
Cerebellar granule cells constitute the majority of neurons in the brain and are the primary conveyors of sensory and motor-related mossy fiber information to Purkinje cells. The functional capability of the cerebellum hinges on whether individual granule cells receive mossy fiber inputs from multiple precerebellar nuclei or are instead unimodal; this distinction is unresolved. Using cell-type-specific projection mapping with synaptic resolution, we observed the convergence of separate sensory (upper body proprioceptive) and basilar pontine pathways onto individual granule cells and mapped this convergence across cerebellar cortex. These findings inform the long-standing debate about the multimodality of mammalian granule cells and substantiate their associative capacity predicted in the Marr-Albus theory of cerebellar function. We also provide evidence that the convergent basilar pontine pathways carry corollary discharges from upper body motor cortical areas. Such merging of related corollary and sensory streams is a critical component of circuit models of predictive motor control. DOI: http://dx.doi.org/10.7554/eLife.00400.001
Deep neural networks have been applied to improve the image quality of fluorescence microscopy. Previous methods are based on convolutional neural networks (CNNs), which generally require time-consuming training of separate models for each new imaging experiment, impairing their applicability and generalization. Once a model is trained (typically with tens to hundreds of image pairs), it can then be used to enhance new images that resemble the training data. In this study, we propose a novel imaging-transformer based model, the Convolutional Neural Network Transformer (CNNT), to outperform CNN networks for image denoising. In our scheme, we trained a single CNNT-based backbone model from pairwise high-low SNR images for one type of fluorescence microscope (instant structured illumination microscopy, iSIM). Fast adaptation to new applications was achieved by fine-tuning the backbone on only 5-10 sample pairs per new experiment. Results show that the CNNT backbone and fine-tuning scheme significantly reduce training time and improve image quality, outperforming separate models trained with CNN approaches such as RCAN and Noise2Fast. Here we show three examples of the efficacy of this approach on denoising wide-field, two-photon, and confocal fluorescence data. In the confocal experiment, a 5 by 5 tiled acquisition, the fine-tuned CNNT model reduces the scan time from one hour to eight minutes, with improved quality.
Deep neural networks can improve the quality of fluorescence microscopy images. Previous methods, based on Convolutional Neural Networks (CNNs), require time-consuming training of individual models for each experiment, impairing their applicability and generalization. In this study, we propose a novel imaging-transformer based model, the Convolutional Neural Network Transformer (CNNT), that outperforms CNN based networks for image denoising. We train a general CNNT based backbone model from pairwise high-low Signal-to-Noise Ratio (SNR) image volumes, gathered from a single type of fluorescence microscope, an instant Structured Illumination Microscope. Fast adaptation to new microscopes is achieved by fine-tuning the backbone on only 5-10 image volume pairs per new experiment. Results show that the CNNT backbone and fine-tuning scheme significantly reduce training time and improve image quality, outperforming models trained using only CNNs such as 3D-RCAN and Noise2Fast. We show three examples of the efficacy of this approach in wide-field, two-photon, and confocal fluorescence microscopy.
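The two abstracts above describe pretraining a denoising backbone on paired high/low-SNR volumes and then adapting it to a new microscope by fine-tuning on only 5-10 image pairs. The sketch below illustrates that fine-tuning step in generic PyTorch, under stated assumptions: the DenoisingBackbone class, checkpoint path, loss choice, and hyperparameters are placeholders for illustration, not the published CNNT implementation.

```python
# Minimal sketch (not the authors' code): fine-tune a pretrained denoiser
# on a handful of paired low/high-SNR volumes from a new microscope.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class DenoisingBackbone(nn.Module):
    """Stand-in for a pretrained denoising backbone (architecture assumed)."""
    def __init__(self, channels=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv3d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv3d(channels, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def fine_tune(model, low_snr, high_snr, epochs=50, lr=1e-4):
    """Adapt a pretrained denoiser using a few (noisy, clean) volume pairs."""
    loader = DataLoader(TensorDataset(low_snr, high_snr), batch_size=1, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()  # a common choice for image restoration
    model.train()
    for _ in range(epochs):
        for noisy, clean in loader:
            opt.zero_grad()
            loss = loss_fn(model(noisy), clean)
            loss.backward()
            opt.step()
    return model

if __name__ == "__main__":
    # Hypothetical example: 8 paired volumes of shape (C, Z, Y, X) = (1, 16, 64, 64).
    low = torch.rand(8, 1, 16, 64, 64)
    high = torch.rand(8, 1, 16, 64, 64)
    backbone = DenoisingBackbone()
    # In practice the backbone would be loaded from a pretrained checkpoint, e.g.:
    # backbone.load_state_dict(torch.load("denoising_backbone.pt"))  # hypothetical path
    fine_tune(backbone, low, high, epochs=5)
```

The design point the abstracts emphasize is that only the small fine-tuning loop runs per new experiment; the expensive pretraining of the backbone is done once.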