2657 Janelia Publications
The fruit fly (Drosophila melanogaster) is a commonly used model organism in biology. We are currently building a 3D digital atlas of the fruit fly larval nervous system (LNS) based on a large collection of fly larva GAL4 lines, each of which targets a subset of neurons. To achieve this goal, we need to automatically align a number of high-resolution confocal image stacks of these GAL4 lines. One commonly employed strategy in image pattern registration is to first globally align images using an affine transform, followed by local non-linear warping. Unfortunately, the spatially articulated and often twisted LNS makes it difficult to globally align the images directly using the affine method. In a parallel project to build a 3D digital map of the adult fly ventral nerve cord (VNC), we are confronted with a similar problem.
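The global affine step described above can be illustrated with a minimal sketch: given corresponding landmark points in two stacks, a 3D affine transform can be fitted by linear least squares. This is an illustrative example, not the authors' pipeline; the function names and the landmark-based formulation are assumptions.

```python
import numpy as np

def fit_affine_3d(src, dst):
    """Least-squares fit of a 3D affine transform mapping src -> dst.

    src, dst: (N, 3) arrays of corresponding landmark coordinates.
    Returns a (3, 4) matrix A such that dst ~= apply_affine_3d(A, src).
    """
    src_h = np.hstack([src, np.ones((len(src), 1))])  # homogeneous coordinates
    A, *_ = np.linalg.lstsq(src_h, dst, rcond=None)   # (4, 3) solution matrix
    return A.T                                        # (3, 4) affine matrix

def apply_affine_3d(A, pts):
    """Apply a (3, 4) affine matrix to an (N, 3) point set."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    return pts_h @ A.T
```

In practice the fitted affine would initialize the subsequent local non-linear warping; for the twisted larval nervous system, as the abstract notes, a single global affine is often a poor fit.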
The stabilization of new spines in the barrel cortex is enhanced after whisker trimming, but its relationship to experience-dependent plasticity is unclear. Here we show that in wild-type mice, whisker potentiation and spine stabilization are most pronounced for layer 5 neurons at the border between spared and deprived barrel columns. In homozygote alphaCaMKII-T286A mice, which lack experience-dependent potentiation of responses to spared whiskers, there is no increase in new spine stabilization at the border between barrel columns after whisker trimming. Our data provide a causal link between new spine synapses and plasticity of adult cortical circuits and suggest that alphaCaMKII autophosphorylation plays a role in the stabilization but not formation of new spines.
To elucidate the role of juvenile hormone (JH) in metamorphosis of Drosophila melanogaster, the corpora allata cells, which produce JH, were killed using the cell death gene grim. These allatectomized (CAX) larvae were smaller at pupariation and died at head eversion. They showed premature ecdysone receptor B1 (EcR-B1) in the photoreceptors and in the optic lobe, downregulation of proliferation in the optic lobe, and separation of R7 from R8 in the medulla during the prepupal period. All of these effects of allatectomy were reversed by feeding third instar larvae on a diet containing the JH mimic (JHM) pyriproxifen or by application of JH III or JHM at the onset of wandering. Eye and optic lobe development in the Methoprene-tolerant (Met)-null mutant mimicked that of CAX prepupae, but the mutant formed viable adults, which had marked abnormalities in the organization of their optic lobe neuropils. Feeding Met(27) larvae on the JHM diet did not rescue the premature EcR-B1 expression or the downregulation of proliferation but did partially rescue the premature separation of R7, suggesting that other pathways besides Met might be involved in mediating the response to JH. Selective expression of Met RNAi in the photoreceptors caused their premature expression of EcR-B1 and the separation of R7 and R8, but driving Met RNAi in lamina neurons led only to the precocious appearance of EcR-B1 in the lamina. Thus, the lack of JH and its receptor Met causes a heterochronic shift in the development of the visual system that is likely to result from some cells 'misinterpreting' the ecdysteroid peaks that drive metamorphosis.
The V3D system provides three-dimensional (3D) visualization of gigabyte-sized microscopy image stacks in real time on current laptops and desktops. V3D streamlines the online analysis, measurement and proofreading of complicated image patterns by combining ergonomic functions for selecting a location in an image directly in 3D space and for displaying biological measurements, such as from fluorescent probes, using the overlaid surface objects. V3D runs on all major computer platforms and can be enhanced by software plug-ins to address specific biological problems. To demonstrate this extensibility, we built a V3D-based application, V3D-Neuron, to reconstruct complex 3D neuronal structures from high-resolution brain images. V3D-Neuron can precisely digitize the morphology of a single neuron in a fruitfly brain in minutes, with about a 17-fold improvement in reliability and tenfold savings in time compared with other neuron reconstruction tools. Using V3D-Neuron, we demonstrate the feasibility of building a 3D digital atlas of neurite tracts in the fruitfly brain.
Nearby neurons, sharing the same locations within the mouse whisker map, can have dramatically distinct response properties. To understand the significance of this diversity, we studied the relationship between the responses of individual neurons and their projection targets in the mouse barrel cortex. Neurons projecting to primary motor cortex (MI) or secondary somatosensory area (SII) were labeled with red fluorescent protein (RFP) using retrograde viral infection. We used in vivo two-photon Ca(2+) imaging to map the responses of RFP-positive and neighboring L2/3 neurons to whisker deflections. Neurons projecting to MI displayed larger receptive fields compared with other neurons, including those projecting to SII. Our findings support the view that intermingled neurons in primary sensory areas send specific stimulus features to different parts of the brain.
Linking activity in specific cell types with perception, cognition, and action requires quantitative behavioral experiments in genetic model systems such as the mouse. In head-fixed primates, the combination of precise stimulus control, monitoring of motor output, and physiological recordings over large numbers of trials is the foundation on which many conceptually rich and quantitative studies have been built. Choice-based, quantitative behavioral paradigms for head-fixed mice have not been described previously. Here, we report a somatosensory absolute object localization task for head-fixed mice. Mice actively used their mystacial vibrissae (whiskers) to sense the location of a vertical pole presented to one side of the head and reported with licking whether the pole was in a target (go) or a distracter (no-go) location. Mice performed hundreds of trials with high performance (>90% correct) and localized to <0.95 mm (<6 degrees of azimuthal angle). Learning occurred over 1-2 weeks and was observed both within and across sessions. Mice could perform object localization with single whiskers. Silencing barrel cortex abolished performance to chance levels. We measured whisker movement and shape for thousands of trials. Mice moved their whiskers in a highly directed, asymmetric manner, focusing on the target location. Translation of the base of the whiskers along the face contributed substantially to whisker movements. Mice tended to maximize contact with the go (rewarded) stimulus while minimizing contact with the no-go stimulus. We conjecture that this may amplify differences in evoked neural activity between trial types.
Biological specimens are rife with optical inhomogeneities that seriously degrade imaging performance under all but the most ideal conditions. Measuring and then correcting for these inhomogeneities is the province of adaptive optics. Here we introduce an approach to adaptive optics in microscopy wherein the rear pupil of an objective lens is segmented into subregions, and light is directed individually to each subregion to measure, by image shift, the deflection faced by each group of rays as they emerge from the objective and travel through the specimen toward the focus. Applying our method to two-photon microscopy, we could recover near-diffraction-limited performance from a variety of biological and nonbiological samples exhibiting aberrations large or small and smoothly varying or abruptly changing. In particular, results from fixed mouse cortical slices illustrate our ability to improve signal and resolution to depths of 400 microm.
Commentary: Introduces a new, zonal approach to adaptive optics (AO) in microscopy suitable for highly inhomogeneous and/or scattering samples such as living tissue. The method is unique in its ability to handle large-amplitude aberrations (>20 wavelengths), including spatially complex aberrations involving high-order modes beyond the ability of most AO actuators to correct. As befits a technique designed for in vivo fluorescence imaging, it is also photon-efficient.
Although used here in conjunction with two-photon microscopy to demonstrate correction deep into scattering tissue, the same principle of pupil segmentation might be profitably adapted to other point-scanning or widefield methods. For example, plane illumination microscopy of multicellular specimens is often beset by substantial aberrations, and all far-field superresolution methods are exquisitely sensitive to aberrations.
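The pupil-segmentation method described above hinges on measuring, for each pupil subregion, the image shift produced by that subregion's rays. A standard way to estimate such a shift between two images is the peak of their FFT-based cross-correlation; the sketch below is a generic illustration of that measurement, not the authors' implementation, and handles only integer-pixel shifts.

```python
import numpy as np

def image_shift(ref, img):
    """Estimate the integer-pixel (dy, dx) translation of img relative to ref
    via the peak of the FFT-based circular cross-correlation."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap the circular peak position into a signed shift
    h, w = ref.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx
```

In an AO context, each subregion's measured shift is proportional to the local wavefront tilt over that part of the pupil, and the set of tilts is then used to drive the corrective element.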
Neurons derived from the same progenitor may acquire different fates according to their birth timing/order. To reveal temporally guided cell fates, we must determine neuron types as well as their lineage relationships and times of birth. Recent advances in genetic lineage analysis and fate mapping are facilitating such studies. For example, high-resolution lineage analysis can identify each sequentially derived neuron of a lineage and has revealed abrupt temporal identity changes in diverse Drosophila neuronal lineages. In addition, fate mapping of mouse neurons made from the same pool of precursors shows production of specific neuron types in specific temporal patterns. The tools used in these analyses are helping to further our understanding of the genetics of neuronal temporal identity.
Automatic alignment (registration) of 3D images of adult fruit fly brains is often influenced by the significant displacement of the relative locations of the two optic lobes (OLs) and the center brain (CB). In one of our ongoing efforts to produce a better image alignment pipeline for adult fruit fly brains, we consider separating the CB and OLs and aligning them independently. This paper reports our automatic method to segregate CB and OLs, in particular under conditions where the signal-to-noise ratio (SNR) is low, the variation in image intensity is large, and the relative displacement of OLs and CB is substantial. We design an algorithm to find a minimum-cost 3D surface in a 3D image stack to best separate an OL (of one side, either left or right) from the CB. This surface is defined as an aggregation of the respective minimum-cost curves detected in each individual 2D image slice. Each curve is defined by a list of control points that best segregate OL and CB. To obtain the locations of these control points, we derive an energy function that includes an image energy term defined by local pixel intensities and two internal energy terms that constrain the curve's smoothness and length. A gradient descent method is used to optimize this energy function. To improve both the speed and robustness of the method, for each stack, the locations of the optimized control points in a slice are taken as the initialization prior for the next slice. We have tested this approach on simulated and real 3D fly brain image stacks and demonstrated that this method can reasonably segregate OLs from CBs despite the aforementioned difficulties.
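The per-slice optimization described above resembles a classic active-contour (snake) energy minimization. The sketch below is a simplified stand-in, not the paper's formulation: it parameterizes the separating curve as one column index per row, uses an image term that prefers low-intensity pixels, adds quadratic smoothness and absolute-difference length penalties, and descends the combined gradient. The parameters alpha, beta, lr, and iters are illustrative.

```python
import numpy as np

def descend_curve(image, x0, alpha=1.0, beta=0.1, lr=0.5, iters=200):
    """Gradient descent on a separating curve x[y] (one column index per row).

    Simplified energy (a stand-in for the paper's formulation):
      E = sum_y I(y, x[y])                      # image term: prefer dark pixels
        + alpha * sum_y (x[y+1] - x[y])**2      # smoothness term
        + beta  * sum_y |x[y+1] - x[y]|         # length-like term
    """
    h, w = image.shape
    gx = np.gradient(image.astype(float), axis=1)  # horizontal image gradient
    x = x0.astype(float).copy()
    rows = np.arange(h)
    for _ in range(iters):
        xi = np.clip(np.round(x).astype(int), 0, w - 1)
        g_img = gx[rows, xi]                       # dE_image / dx[y]
        d = np.diff(x)
        g_smooth = np.zeros(h)
        g_smooth[:-1] -= 2 * alpha * d             # d/dx[y] of (x[y+1]-x[y])^2
        g_smooth[1:] += 2 * alpha * d
        g_len = np.zeros(h)
        g_len[:-1] -= beta * np.sign(d)
        g_len[1:] += beta * np.sign(d)
        x -= lr * (g_img + g_smooth + g_len)
        x = np.clip(x, 0, w - 1)
    return x
```

Mirroring the warm-start strategy in the abstract, the converged curve for one slice could be passed as `x0` for the next slice of the stack.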