2689 Janelia Publications
Showing 111-120 of 2689 results

The ventromedial hypothalamus (VMH) projects to the periaqueductal gray (PAG) and anterior hypothalamic nucleus (AHN), mediating freezing and escape behaviors, respectively. We investigated VMH collateral (VMH-coll) neurons, which innervate both the PAG and AHN, to elucidate their role in postsynaptic processing and defensive behavior plasticity. Using all-optical voltage imaging of 22,151 postsynaptic neurons ex vivo, we found that VMH-coll neurons engage inhibitory mechanisms at both synaptic ends and can induce synaptic circuit plasticity. In vivo optogenetic activation of VMH-coll somas induced escape behaviors. We identified an Esr1-expressing VMH-coll subpopulation whose postsynaptic connectome resembles that of wild-type collaterals on the PAG side. Activation of Esr1+ VMH-coll neurons evoked freezing and an unexpected flattening behavior not previously linked to the VMH. Neuropeptides such as PACAP and dynorphin modulated both Esr1+ VMH-coll connectomes. In vivo κ-opioid receptor antagonism impaired Esr1+ VMH-coll-mediated defensive behaviors. These findings unveil a central role for VMH-coll pathways in innate defensive behavior plasticity.
Artificial neural networks learn faster if they are initialized well. Good initializations can generate high-dimensional macroscopic dynamics with long timescales. It is not known if biological neural networks have similar properties. Here we show that the eigenvalue spectrum and dynamical properties of large-scale neural recordings in mice (two-photon and electrophysiology) are similar to those produced by linear dynamics governed by a random symmetric matrix that is critically normalized. An exception was hippocampal area CA1: population activity in this area resembled an efficient, uncorrelated neural code, which may be optimized for information storage capacity. Global emergent activity modes persisted in simulations with sparse, clustered or spatial connectivity. We hypothesize that the spontaneous neural activity reflects a critical initialization of whole-brain neural circuits that is optimized for learning time-dependent tasks.
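The spectral claim above can be illustrated with a small, self-contained numpy sketch (not the authors' analysis code). It builds a random symmetric matrix, rescales it so the leading eigenvalue sits just below the stability boundary of the linear dynamics x(t+1) = J x(t), and reports the distribution of mode timescales; the Wigner-type scaling and the 0.999 cutoff are assumptions chosen for illustration.

```python
# Illustrative sketch only: eigenvalue spectrum and mode timescales of a
# critically normalized random symmetric matrix (not the paper's pipeline).
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Random symmetric (Wigner-type) matrix with Gaussian entries.
a = rng.standard_normal((n, n))
j = (a + a.T) / np.sqrt(2.0 * n)

# "Critical" normalization (assumption): rescale so the largest eigenvalue
# magnitude is just below 1, the stability boundary of x(t+1) = J x(t).
eigvals = np.linalg.eigvalsh(j)
j_crit = j * (0.999 / np.abs(eigvals).max())
lam = np.linalg.eigvalsh(j_crit)

# Each eigenmode decays with timescale tau = -1 / ln|lambda|; modes with
# eigenvalues near 1 give the long timescales referenced above.
tau = -1.0 / np.log(np.abs(lam))
print(f"leading eigenvalue: {lam.max():.4f}")
print(f"longest mode timescale: {tau.max():.0f} steps")
```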
Optical aberrations hinder fluorescence microscopy of thick samples, reducing image signal, contrast, and resolution. Here we introduce a deep learning-based strategy for aberration compensation that improves image quality without slowing image acquisition, applying additional dose, or introducing more optics. Our method (i) introduces synthetic aberrations to images acquired on the shallow side of image stacks, making them resemble images acquired deeper into the volume, and (ii) trains neural networks to reverse the effect of these aberrations. Using simulations and experiments, we show that the trained ‘de-aberration’ networks outperform alternative methods, providing restoration on par with adaptive optics techniques. We then apply the networks to diverse datasets captured with confocal, light-sheet, multi-photon, and super-resolution microscopy. In all cases, the improved quality of the restored data facilitates qualitative image inspection and improves downstream image quantitation, including orientational analysis of blood vessels in mouse tissue and improved membrane and nuclear segmentation in C. elegans embryos.
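As a rough illustration of the training-data strategy described in steps (i)-(ii), the sketch below degrades shallow, high-quality planes with a synthetic aberration and pairs each degraded plane with its original as (input, target) for supervised training. It is a schematic only: the depth-dependent Gaussian blur plus noise, the function name make_training_pair, and its parameters are placeholder assumptions, not the optical aberration model or code used in the paper.

```python
# Schematic sketch of the synthetic-aberration training-pair idea
# (placeholder degradation model, not the published method's optics).
import numpy as np
from scipy.ndimage import gaussian_filter

def make_training_pair(shallow_plane, depth_um, blur_per_um=0.02,
                       noise_sigma=0.01, rng=None):
    """Return (aberrated_input, ground_truth) for one shallow-side plane.

    A depth-dependent Gaussian blur plus additive noise is a crude
    stand-in for an aberrated point-spread function (an assumption).
    """
    rng = rng or np.random.default_rng()
    sigma = blur_per_um * depth_um                    # blur grows with target depth
    degraded = gaussian_filter(shallow_plane, sigma=sigma)
    degraded = degraded + rng.normal(0.0, noise_sigma, degraded.shape)
    return degraded.astype(np.float32), shallow_plane.astype(np.float32)

# Example: synthesize pairs from a toy stack; a real pipeline would feed
# these (input, target) pairs to the 'de-aberration' network for training.
stack = np.random.rand(8, 256, 256).astype(np.float32)
pairs = [make_training_pair(plane, depth_um=10.0 * (i + 1))
         for i, plane in enumerate(stack)]
```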
During development, cells undergo a sequence of specification events to form functional tissues and organs. To investigate complex tissue development, it is crucial to visualize how cell lineages emerge and to manipulate regulatory factors with temporal control. We recently developed TEMPO (Temporal Encoding and Manipulation in a Predefined Order), a genetic tool for labeling consecutive cell generations in vertebrates with different colors and for manipulating them genetically. TEMPO relies on CRISPR to activate a cascade of fluorescent proteins that can be imaged in vivo. Here, we explain the steps to design, generate, and express TEMPO constructs in zebrafish and mice.
Fluorescence microscopy is essential for biological research, offering high-contrast imaging of microscopic structures. However, the quality of these images is often compromised by optical aberrations and noise, particularly under low signal-to-noise ratio (SNR) conditions. Adaptive optics (AO) can correct aberrations but requires costly hardware and slows imaging, whereas current denoising approaches boost the SNR without compensating for aberrations. To address these limitations, we introduce HD2Net, a deep learning framework that enhances image quality by simultaneously denoising and suppressing the effects of aberrations, without the need for additional hardware. Building on our previous work, HD2Net incorporates noise estimation and aberration removal modules, effectively restoring images degraded by noise and aberrations. Through comprehensive evaluation on synthetic phantoms and biological data, we demonstrate that HD2Net outperforms existing methods, significantly improving image resolution and contrast. This framework offers a promising solution for enhancing biological imaging, particularly under challenging aberration-prone and low-light conditions.
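At the architectural level, the abstract describes two chained stages: noise estimation/removal followed by aberration suppression. The PyTorch sketch below shows only that composition pattern with tiny placeholder convolutional stages; the module names, layer sizes, and residual formulation are assumptions, not the published HD2Net architecture.

```python
# Minimal structural sketch of a two-stage restore pipeline (assumed layout,
# not the published HD2Net code): denoise first, then remove aberrations.
import torch
import torch.nn as nn

class DenoiseModule(nn.Module):
    """Placeholder noise-estimation/removal stage: a small residual CNN."""
    def __init__(self, channels: int = 1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x):
        return x - self.net(x)            # predict the noise, subtract it

class DeaberrationModule(nn.Module):
    """Placeholder aberration-removal stage acting on the denoised image."""
    def __init__(self, channels: int = 1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

class TwoStageRestorer(nn.Module):
    """Chain the two stages, as the abstract describes at a high level."""
    def __init__(self):
        super().__init__()
        self.denoise = DenoiseModule()
        self.deaberrate = DeaberrationModule()

    def forward(self, x):
        return self.deaberrate(self.denoise(x))

restored = TwoStageRestorer()(torch.rand(1, 1, 128, 128))
```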
Zebrafish larvae are used to model the pathogenesis of multiple bacteria. This transparent model offers the unique advantage of allowing fluorescent bacterial burdens (fluorescent pixel counts [FPC]) to be quantified by facile microscopy, replacing enumeration of bacteria by time-intensive plating of lysates on bacteriological media. Accurate FPC measurements, however, require laborious manual image processing to mark the outside borders of the animals and thereby distinguish bacteria inside the animals from those in the surrounding culture medium. Here, we have developed an automated ImageJ/Fiji-based macro that accurately detects the outside borders of -infected larvae.
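The published tool is an ImageJ/Fiji macro; as a language-consistent sketch of the same FPC logic, the Python function below segments the larva outline from a brightfield channel and counts above-threshold fluorescent pixels only inside that outline. The Otsu thresholds, the dark-larva-on-bright-medium assumption, and the min_larva_area parameter are placeholders, not taken from the published macro.

```python
# Rough Python analogue (not the published ImageJ/Fiji macro) of the FPC idea:
# segment the larva from a brightfield image, then count fluorescent pixels
# inside the animal only, excluding bacteria in the surrounding medium.
import numpy as np
from scipy.ndimage import binary_fill_holes
from skimage.filters import threshold_otsu
from skimage.morphology import remove_small_objects

def fluorescent_pixel_count(brightfield: np.ndarray, fluorescence: np.ndarray,
                            min_larva_area: int = 5000) -> int:
    """Count fluorescent pixels inside the larva outline (placeholder thresholds)."""
    # Assume the larva appears darker than the medium in brightfield.
    larva_mask = brightfield < threshold_otsu(brightfield)
    larva_mask = binary_fill_holes(larva_mask)
    larva_mask = remove_small_objects(larva_mask, min_size=min_larva_area)

    # Otsu threshold on the fluorescence channel, applied inside the mask only.
    fluor_thresh = threshold_otsu(fluorescence)
    return int(np.count_nonzero((fluorescence > fluor_thresh) & larva_mask))
```

In practice the two arrays would come from the brightfield and fluorescence channels of the same larva image, and the FPC values across animals would replace colony counts from plated lysates.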
Animals need to rapidly learn to recognize and avoid predators. This ability may be especially important for young animals because of their increased vulnerability. It is unknown whether, and how, nascent vertebrates are capable of such rapid learning. Here, we used a robotic predator-prey interaction assay to show that 1 week after fertilization, a developmental stage at which they have approximately 1% of the adult number of neurons, zebrafish larvae rapidly and robustly learn to recognize a stationary object as a threat after the object pursues the fish for ∼1 min. Larvae continue to avoid the threatening object after it stops moving and can learn to distinguish threatening from non-threatening objects of a different color. Whole-brain functional imaging revealed the multi-timescale activity of noradrenergic neurons and forebrain circuits that encoded the threat. Chemogenetic ablation of those populations prevented the learning. Thus, a noradrenergic and forebrain multiregional network underlies the ability of young vertebrates to rapidly learn to recognize potential predators within their first week of life.
Effective classification of neuronal cell types requires both molecular and morphological descriptors to be collected in situ at single-cell resolution. However, current spatial transcriptomics techniques are not compatible with imaging workflows that successfully reconstruct the morphology of complete axonal projections. Here, we introduce a new methodology, which we call morphoFISH, that combines tissue clearing, submicron whole-brain two-photon imaging, and Expansion-Assisted Iterative Fluorescence In Situ Hybridization (EASI-FISH) to assign molecular identities to fully reconstructed neurons in the mouse brain. We used morphoFISH to molecularly identify a previously unknown population of cingulate neurons projecting ipsilaterally to the dorsal striatum and contralaterally to higher-order thalamus. By pairing whole-brain morphometry with improved techniques for nucleic acid preservation and spatial gene expression, morphoFISH offers a quantitative solution for the discovery of multimodal cell types and complements existing techniques for characterizing increasingly fine-grained cellular heterogeneity in brain circuits.