2432 Janelia Publications
The comprehensive reconstruction of cell lineages in complex multicellular organisms is a central goal of developmental biology. We present an open-source computational framework for the segmentation and tracking of cell nuclei with high accuracy and speed. We demonstrate its (i) generality by reconstructing cell lineages in four-dimensional, terabyte-sized image data sets of fruit fly, zebrafish and mouse embryos acquired with three types of fluorescence microscopes, (ii) scalability by analyzing advanced stages of development with up to 20,000 cells per time point at 26,000 cells per minute on a single computer workstation and (iii) ease of use by adjusting only two parameters across all data sets and providing visualization and editing tools for efficient data curation. Our approach achieves on average 97.0% linkage accuracy across all species and imaging modalities. Using our system, we performed the first cell lineage reconstruction of early Drosophila melanogaster nervous system development, revealing neuroblast dynamics throughout an entire embryo.
Compared to the dorsal hippocampus, relatively few studies have characterized neuronal responses in the ventral hippocampus. In particular, it is unclear whether and how cells in the ventral region represent space and/or respond to contextual changes. We recorded from dorsal and ventral CA1 neurons in freely moving mice exposed to manipulations of visuospatial and olfactory contexts. We found that ventral cells respond to alterations of the visuospatial environment (exposure to novel local cues, cue rotations, and contextual expansion) in ways similar to dorsal cells, with the exception of cue rotations. Furthermore, we found that ventral cells responded to odors much more strongly than dorsal cells, particularly to odors of high valence. Similar to earlier studies recording from the ventral hippocampus in CA3, we also found increased scaling of place cell field size along the longitudinal hippocampal axis. Although the increase in place field size observed toward the ventral pole has previously been taken to suggest a decrease in spatial information coded by ventral place cells, we hypothesized that a change in spatial scaling could instead signal a shift in representational coding that preserves the resolution of spatial information. To explore this possibility, we examined population activity using principal component analysis (PCA) and neural location reconstruction techniques. Our results suggest that ventral populations encode a distributed representation of space, and that the resolution of spatial information at the population level is comparable to that of dorsal populations of similar size. Finally, through the use of neural network modeling, we suggest that the redundancy in spatial representation along the longitudinal hippocampal axis may allow the hippocampus to overcome the conflict between memory interference and generalization inherent in neural network memory. Our results suggest that ventral population activity is well suited for generalization across locations and contexts.
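The population-level claim above — that broad ventral place fields can still support fine spatial resolution when read out across a population — can be illustrated with a toy template-matching decoder. This is a hedged sketch, not the study's reconstruction method: the Gaussian tuning curves, the noise level, and the nearest-template decoder are all hypothetical stand-ins chosen for illustration.

```python
import numpy as np

def simulate_population(n_cells, field_width, track_len=1.0, n_pos=100):
    """Place-cell population with Gaussian tuning curves tiling a linear track.
    Returns the position grid and an (n_pos, n_cells) matrix of firing rates."""
    centers = np.linspace(0.0, track_len, n_cells)
    pos = np.linspace(0.0, track_len, n_pos)
    rates = np.exp(-0.5 * ((pos[:, None] - centers[None, :]) / field_width) ** 2)
    return pos, rates

def decode_error(rates, pos, noise=0.1, seed=1):
    """Template-matching decoder: add noise to each population vector, then
    assign it to the nearest noiseless tuning-curve template (Euclidean
    distance). Returns the mean absolute decoding error in track units."""
    rng = np.random.default_rng(seed)
    noisy = rates + noise * rng.standard_normal(rates.shape)
    d2 = ((noisy[:, None, :] - rates[None, :, :]) ** 2).sum(axis=-1)
    decoded = pos[np.argmin(d2, axis=1)]
    return np.abs(decoded - pos).mean()
```

With the same number of cells, a "dorsal-like" population (narrow fields, e.g. `field_width=0.05`) and a "ventral-like" population (broad fields, e.g. `field_width=0.2`) both decode position far better than chance, because broad overlapping fields form a distributed code whose population vector still changes reliably with location.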
We discovered a bimodal behavior in the genetically tractable organism Drosophila melanogaster that allowed us to directly probe the neural mechanisms of an action selection process. When confronted by a predator-mimicking looming stimulus, a fly responds with either a long-duration escape behavior sequence that initiates stable flight or a distinct, short-duration sequence that sacrifices flight stability for speed. Intracellular recording of the descending giant fiber (GF) interneuron during head-fixed escape revealed that GF spike timing relative to parallel circuits for escape actions determined which of the two behavioral responses was elicited. The process was well described by a simple model in which the GF circuit has a higher activation threshold than the parallel circuits, but can override ongoing behavior to force a short takeoff. Our findings suggest a neural mechanism for action selection in which relative activation timing of parallel circuits creates the appropriate motor output.
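The action-selection model described above (a giant fiber pathway with a higher activation threshold than the parallel escape circuits) can be caricatured with a toy shared integrator. This is an illustrative sketch only: the thresholds, the linear looming drive, and the rule that any GF spike overrides the parallel pathway are hypothetical simplifications, not the authors' fitted model, and the full timing-dependence of the override is not captured.

```python
def escape_choice(loom_rate, t_max=1.0, dt=0.001,
                  theta_parallel=1.0, theta_gf=2.5):
    """Toy threshold model of escape-action selection.

    Both pathways integrate the same looming drive. The parallel circuit
    has the lower threshold and initiates the long escape sequence; if the
    giant fiber (GF) also crosses its higher threshold, it overrides and
    forces the short takeoff. Stimulus intensity thus sets which response
    is elicited. All parameter values are hypothetical."""
    v = 0.0
    t_parallel = t_gf = None
    n_steps = int(t_max / dt)
    for step in range(n_steps):
        v += loom_rate * dt  # shared integrated looming drive
        t = (step + 1) * dt
        if t_parallel is None and v >= theta_parallel:
            t_parallel = t   # parallel circuit fires first (lower threshold)
        if t_gf is None and v >= theta_gf:
            t_gf = t         # GF fires only for stronger stimuli
    if t_gf is not None:
        return "short takeoff (GF override)", t_gf
    if t_parallel is not None:
        return "long escape sequence", t_parallel
    return "no response", None
```

A weak loom never reaches either threshold, an intermediate loom triggers only the parallel circuit (long, stable-flight sequence), and a strong loom drives the integrator past the GF threshold, forcing the short takeoff.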
To provide a temporal framework for the genoarchitecture of brain development, we generated in situ hybridization data for embryonic and postnatal mouse brain at seven developmental stages for ∼2,100 genes, which were processed with an automated informatics pipeline and manually annotated. This resource comprises 434,946 images, seven reference atlases, an ontogenetic ontology, and tools to explore coexpression of genes across neurodevelopment. Gene sets coinciding with developmental phenomena were identified. A temporal shift in the principles governing the molecular organization of the brain was detected, with transient neuromeric, plate-based organization of the brain present at E11.5 and E13.5. Finally, these data provided a transcription factor code that discriminates brain structures and identifies the developmental age of a tissue, providing a foundation for eventual genetic manipulation or tracking of specific brain structures over development. The resource is available as the Allen Developing Mouse Brain Atlas (http://developingmouse.brain-map.org).
Only a few years after its inception, localization-based super-resolution microscopy has become widely employed in biological studies. Yet, it is primarily used in two-dimensional imaging and accessing the organization of cellular structures at the nanoscale in three dimensions (3D) still poses important challenges. Here, we review optical and computational techniques that enable the 3D localization of individual emitters and the reconstruction of 3D super-resolution images. These techniques are grouped into three main categories: PSF engineering, multiple plane imaging and interferometric approaches. We provide an overview of their technical implementation as well as commentary on their applicability. Finally, we discuss future trends in 3D localization-based super-resolution microscopy.
During many natural behaviors the relevant sensory stimuli and motor outputs are difficult to quantify. Furthermore, the high dimensionality of the space of possible stimuli and movements compounds the problem of experimental control. Head fixation facilitates stimulus control and movement tracking, and can be combined with techniques for recording and manipulating neural activity. However, head-fixed mouse behaviors are typically trained through extensive instrumental conditioning. Here we present a whisker-based, tactile virtual reality system for head-fixed mice running on a spherical treadmill. Head-fixed mice displayed natural movements, including running and rhythmic whisking at 16 Hz. Whisking was centered on a set point that changed in concert with running so that more protracted whisking was correlated with faster running. During turning, whiskers moved in an asymmetric manner, with more retracted whisker positions in the turn direction and protracted whisker movements on the other side. Under some conditions, whisker movements were phase-coupled to strides. We simulated a virtual reality tactile corridor, consisting of two moveable walls controlled in a closed-loop by running speed and direction. Mice used their whiskers to track the walls of the winding corridor without training. Whisker curvature changes, which cause forces in the sensory follicles at the base of the whiskers, were tightly coupled to distance from the walls. Our behavioral system allows for precise control of sensorimotor variables during natural tactile navigation.
Retinal bipolar cells (BCs) transmit visual signals in parallel channels from the outer to the inner retina, where they provide glutamatergic inputs to specific networks of amacrine and ganglion cells. Computation at BC axon terminals has been proposed as a mechanism for complex network computations, such as direction selectivity, but direct knowledge of the receptive field properties and the synaptic connectivity of the axon terminals of various BC types is required to understand the role of axonal computation by BCs. The present study tested the essential assumptions of the presynaptic model of direction selectivity at axon terminals of three functionally distinct BC types that ramify in the direction-selective strata of the mouse retina. Results from two-photon Ca²⁺ imaging, optogenetic stimulation, and dual patch-clamp recording demonstrated that (1) CB5 cells do not receive fast GABAergic synaptic feedback from starburst amacrine cells (SACs), (2) light-evoked and spontaneous Ca²⁺ responses are well coordinated among various local regions of CB5 axon terminals, (3) CB5 axon terminals are not directionally selective, (4) CB5 cells consist of two novel functional subtypes with distinct receptive field structures, (5) CB7 cells provide direct excitatory synaptic inputs to, but receive no direct GABAergic synaptic feedback from, SACs, and (6) CB7 axon terminals are not directionally selective either. These findings help to simplify models of direction selectivity by ruling out complex computation at BC terminals. They also show that CB5 comprises two functional subclasses of BCs.
We propose a version of the least-mean-square (LMS) algorithm for sparse system identification. Our algorithm, called online linearized Bregman iteration (OLBI), is derived by minimizing the cumulative squared prediction error along with an ℓ1-ℓ2 norm regularizer. By systematically treating the non-differentiable regularizer we arrive at a simple two-step iteration. We demonstrate that OLBI is bias-free and compare its operation with existing sparse LMS algorithms by rederiving them in the online convex optimization framework. We perform convergence analysis of OLBI for white input signals and derive theoretical expressions for the steady-state mean square deviation (MSD). We demonstrate numerically that OLBI improves the performance of LMS-type algorithms for signals generated from sparse tap weights.
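The simple two-step iteration described above can be sketched as follows. This is an illustrative reconstruction based on the standard linearized Bregman iteration, not the authors' code: the step size `mu`, the shrinkage parameter `lam`, and the soft-threshold form of the shrinkage step are assumptions.

```python
import numpy as np

def soft_threshold(z, lam):
    """Component-wise shrinkage (proximal map of the l1 norm)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def olbi(x_stream, d_stream, n_taps, mu=0.02, lam=0.5):
    """Sketch of online linearized Bregman iteration for sparse system
    identification. Each sample triggers a two-step update:
      (1) an LMS-style gradient step on the auxiliary variable z,
      (2) shrinkage of z to obtain the sparse tap-weight estimate w.
    Because w is read off through the shrinkage of a growing z, active
    taps are not biased toward zero at steady state."""
    z = np.zeros(n_taps)  # auxiliary variable accumulating gradient steps
    w = np.zeros(n_taps)  # sparse estimate of the unknown tap weights
    for x, d in zip(x_stream, d_stream):
        e = d - w @ x                   # instantaneous prediction error
        z += mu * e * x                 # step 1: gradient step on z
        w = soft_threshold(z, lam)      # step 2: shrinkage
    return w
```

On white Gaussian input driving a sparse unknown filter, the estimate converges to the true taps while the off-support weights stay pinned at zero by the shrinkage, consistent with the bias-free behavior claimed in the abstract.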
Rapidly and selectively modulating the activity of defined neurons in unrestrained animals is a powerful approach in investigating the circuit mechanisms that shape behavior. In Drosophila melanogaster, temperature-sensitive silencers and activators are widely used to control the activities of genetically defined neuronal cell types. A limitation of these thermogenetic approaches, however, has been their poor temporal resolution. Here we introduce FlyMAD (the fly mind-altering device), which allows thermogenetic silencing or activation within seconds or even fractions of a second. Using computer vision, FlyMAD targets an infrared laser to freely walking flies. As a proof of principle, we demonstrated the rapid silencing and activation of neurons involved in locomotion, vision and courtship. The spatial resolution of the focused beam enabled preferential targeting of neurons in the brain or ventral nerve cord. Moreover, the high temporal resolution of FlyMAD allowed us to discover distinct timing relationships for two neuronal cell types previously linked to courtship song.
Three-dimensional (3D) bioimaging, visualization and data analysis are in strong need of powerful 3D exploration techniques. We develop virtual finger (VF) to generate 3D curves, points and regions-of-interest in the 3D space of a volumetric image with a single finger operation, such as a computer mouse stroke, or click or zoom from the 2D-projection plane of an image as visualized with a computer. VF provides efficient methods for acquisition, visualization and analysis of 3D images for roundworm, fruitfly, dragonfly, mouse, rat and human. Specifically, VF enables instant 3D optical zoom-in imaging, 3D free-form optical microsurgery, and 3D visualization and annotation of terabytes of whole-brain image volumes. VF also leads to orders of magnitude better efficiency of automated 3D reconstruction of neurons and similar biostructures over our previous systems. We use VF to generate from images of 1,107 Drosophila GAL4 lines a projectome of a Drosophila brain.