2657 Janelia Publications
Showing 391-400 of 2657 results

While cytokinesis has been intensely studied, the way it is executed during development is not well understood, despite a long-standing appreciation that various aspects of cytokinesis vary across cell and tissue types. To address this, we investigated cytokinesis during the invariant embryonic divisions and found several reproducibly altered parameters at different stages. During early divisions, furrow ingression asymmetry and midbody inheritance are consistent, suggesting specific regulation of these events. During morphogenesis, we found several unexpected alterations to cytokinesis, including apical midbody migration in polarizing epithelial cells of the gut, pharynx, and sensory neurons. Aurora B kinase, which is essential for several aspects of cytokinesis, remains apically localized in each of these tissues after internalization of midbody ring components. Aurora B inactivation disrupts cytokinesis and causes defects in apical structures, even if inactivated post-mitotically. Therefore, cytokinesis is implemented in a specialized way during epithelial polarization, and Aurora B has a new role in the formation of the apical surface.
The most established method of reconstructing neural circuits from animals involves slicing tissue very thin, then taking mosaics of electron microscope (EM) images. To trace neurons across different images and through different sections, these images must be accurately aligned, both with the others in the same section and with the sections above and below. Unfortunately, sectioning and imaging are not ideal processes: among the problems that make alignment difficult are lens distortion, tissue shrinkage during imaging, tears and folds in the sectioned tissue, and dust and other artifacts. In addition, the data sets are large (hundreds of thousands of images) and each image must be aligned with many neighbors, so the process must be automated and reliable. This paper discusses methods of dealing with these problems, with numeric results describing the accuracy of the resulting alignments.
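A minimal sketch of the basic building block, pairwise translational alignment of two overlapping tiles, is shown below. This is not the paper's pipeline (which must also model lens distortion, tears, and folds); it only illustrates recovering a rigid shift with off-the-shelf phase correlation from scikit-image.

```python
# Pairwise translational alignment of two overlapping EM tiles via phase
# correlation. A sketch only: the paper's method additionally corrects
# lens distortion and handles tears and folds, which a rigid shift cannot.
import numpy as np
from skimage.registration import phase_cross_correlation

def align_pair(ref_tile, moving_tile):
    """Return the (row, col) shift that registers moving_tile to ref_tile."""
    shift, error, _ = phase_cross_correlation(
        ref_tile, moving_tile, upsample_factor=10  # subpixel refinement
    )
    return shift

# Self-check with a synthetic known displacement:
rng = np.random.default_rng(0)
ref = rng.random((512, 512))
moving = np.roll(ref, shift=(7, -3), axis=(0, 1))
print(align_pair(ref, moving))  # approximately [-7.  3.]
```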
The cerebellum plays an important role in both motor control and cognitive function. Cerebellar function is topographically organized, and diseases that affect specific parts of the cerebellum are associated with specific patterns of symptoms. Accordingly, delineation and quantification of cerebellar sub-regions from magnetic resonance images are important in the study of cerebellar atrophy and associated functional losses. This paper describes an automated cerebellar lobule segmentation method based on a graph cut segmentation framework. Results from multi-atlas labeling and tissue classification contribute to the region terms in the graph cut energy function, and boundary classification contributes to the boundary term in the energy function. A cerebellar parcellation is achieved by minimizing the energy function using the α-expansion technique. The proposed method was evaluated using a leave-one-out cross-validation on 15 subjects, including both healthy controls and patients with cerebellar diseases. Based on reported Dice coefficients, the proposed method outperforms two state-of-the-art methods. The proposed method was then applied to 77 subjects to study region-specific cerebellar structural differences in three spinocerebellar ataxia (SCA) genetic subtypes. Quantitative analysis of the lobule volumes shows distinct patterns of volume changes associated with different SCA subtypes, consistent with known patterns of atrophy in these genetic subtypes.
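The energy function described here has the standard graph-cut labeling form; a generic rendering (notation assumed, not taken from the paper) is:

```latex
E(f) = \sum_{p \in \mathcal{P}} D_p(f_p)
     + \lambda \sum_{(p,q) \in \mathcal{N}} B_{p,q}\,[f_p \neq f_q]
```

where the labeling f assigns a lobule label f_p to each voxel p, the region (data) terms D_p are derived from the multi-atlas labeling and tissue classification results, the boundary weights B_{p,q} come from the boundary classifier, and [·] equals 1 when neighboring voxels take different labels. Minimizing E with α-expansion yields the parcellation.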
We describe a method for fully automated detection of chemical synapses in serial electron microscopy images with highly anisotropic axial and lateral resolution, such as images taken on transmission electron microscopes. Our pipeline starts from classification of the pixels based on 3D pixel features, which is followed by segmentation with an Ising model MRF and another classification step, based on object-level features. Classifiers are learned on sparse user labels; a fully annotated data subvolume is not required for training. The algorithm was validated on a set of 238 synapses in 20 serial 7197×7351 pixel images (4.5×4.5×45 nm resolution) of mouse visual cortex, manually labeled by three independent human annotators and additionally re-verified by an expert neuroscientist. The error rate of the algorithm (12% false negative, 7% false positive detections) is better than state-of-the-art, even though, unlike the state-of-the-art method, our algorithm does not require a prior segmentation of the image volume into cells. The software is based on the ilastik learning and segmentation toolkit and the vigra image processing library and is freely available on our website, along with the test data and gold standard annotations (http://www.ilastik.org/synapse-detection/sstem).
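A schematic of the pipeline's stages, using generic scikit-learn and scipy stand-ins rather than the published ilastik/vigra implementation, might look like the following; the Ising-model MRF step is replaced by a plain threshold for brevity, and the object features shown are illustrative.

```python
# Illustrative two-stage detection: threshold a per-voxel probability map
# (MRF stand-in), extract candidate objects, then accept or reject each
# candidate with an object-level classifier. Not the published code.
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier  # e.g. object classifier

def synapse_candidates(prob_volume, object_clf, min_size=50):
    """prob_volume: 3D per-voxel synapse probabilities from the pixel
    classifier. object_clf: fitted classifier over object features."""
    mask = prob_volume > 0.5                     # MRF stand-in
    labels, n = ndimage.label(mask)              # 3D connected components
    accepted = []
    for obj in range(1, n + 1):
        voxels = labels == obj
        size = int(voxels.sum())
        if size < min_size:
            continue
        # Object-level features (illustrative): size and mean probability.
        feats = np.array([[size, float(prob_volume[voxels].mean())]])
        if object_clf.predict(feats)[0] == 1:
            accepted.append(obj)
    return labels, accepted

# object_clf would be trained on labeled candidates first, e.g.:
# object_clf = RandomForestClassifier(n_estimators=100).fit(F_train, y_train)
```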
High-resolution microscopic imaging of biological samples often produces multiple 3D image tiles to cover a large field of view of the specimen. Each tile is typically large, in the range of hundreds of megabytes to several gigabytes. For many of our image data sets, existing software tools are often unable to stitch these 3D tiles into a panoramic view, impeding further data analysis. We propose a simple but accurate, robust, and automatic method to stitch a group of image tiles without a priori adjacency information. We first use a multiscale strategy to register a pair of 3D image tiles rapidly, achieving roughly 8-10 times faster speed and 10 times lower memory requirements compared to previous methods. Then we design a minimum-spanning-tree based method to determine the optimal adjacency of tiles. We have successfully stitched large image stacks of model animals including C. elegans, fruit fly, dragonfly, and mouse, which could not be stitched by several existing methods.
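The minimum-spanning-tree idea can be sketched as follows: score every tile pair by registration quality, then keep the spanning tree of the best links as the stitching order. The pairwise `registration_score` below is a stand-in for the paper's multiscale registration; everything here is illustrative.

```python
# Recover tile adjacency as the minimum spanning tree over pairwise
# registration costs. `registration_score(a, b)` (higher is better) stands
# in for the paper's multiscale pairwise registration.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def optimal_adjacency(tiles, registration_score):
    """Return (i, j) tile pairs forming the stitching tree."""
    n = len(tiles)
    cost = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            # Turn similarity into cost; epsilon guards divide-by-zero.
            cost[i, j] = 1.0 / (registration_score(tiles[i], tiles[j]) + 1e-9)
    tree = minimum_spanning_tree(cost)  # keeps the cheapest spanning edges
    return list(zip(*tree.nonzero()))
```

For instance, `registration_score` could return the peak cross-correlation found during pairwise registration, so that confidently registered pairs become cheap edges and are retained in the tree.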
The behavior of individuals determines the strength and outcome of ecological interactions, which drive population, community, and ecosystem organization. Bio-logging, such as telemetry and animal-borne imaging, provides essential individual viewpoints, tracks, and life histories, but requires capture of individuals and is often impractical to scale. Recent developments in automated image-based tracking offer opportunities to remotely quantify and understand individual behavior at scales and resolutions not previously possible, providing an essential supplement to other tracking methodologies in ecology. Automated image-based tracking should continue to advance the field of ecology by enabling better understanding of the linkages between individual and higher-level ecological processes, via high-throughput quantitative analysis of complex ecological patterns and processes across scales, including analysis of environmental drivers.
A quantitative description of animal social behaviour is informative for behavioural biologists and clinicians developing drugs to treat social disorders. Social interaction in a group of animals has been difficult to measure because behaviour develops over long periods of time and requires tedious manual scoring, which is subjective and often non-reproducible. Computer-vision systems with the ability to measure complex social behaviour automatically would have a transformative impact on biology. Here, we present a method for tracking group-housed mice individually as they freely interact over multiple days. Each mouse is bleach-marked with a unique fur pattern. The patterns are automatically learned by the tracking software and used to infer identities. Trajectories are analysed to measure behaviour as it develops over days, beyond the range of acute experiments. We demonstrate how our system may be used to study the development of place preferences, associations and social relationships by tracking four mice continuously for five days. Our system enables accurate and reproducible characterisation of wild-type mouse social behaviour and paves the way for high-throughput long-term observation of the effects of genetic, pharmacological and environmental manipulations.
Reconstruction of neural circuitry at single-synapse resolution is an attractive target for improving understanding of the nervous system in health and disease. Serial section transmission electron microscopy (ssTEM) is among the most prolific imaging methods employed in pursuit of such reconstructions. We demonstrate how Flood-Filling Networks (FFNs) can be used to computationally segment a forty-teravoxel whole-brain Drosophila ssTEM volume. To compensate for data irregularities and imperfect global alignment, FFNs were combined with procedures that locally re-align serial sections and dynamically adjust image content. The proposed approach produced a largely merger-free segmentation of the entire ssTEM Drosophila brain, which we make freely available. As compared to manual tracing using an efficient skeletonization strategy, the segmentation enabled circuit reconstruction and analysis workflows that were an order of magnitude faster.
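The flood-filling idea itself can be caricatured in a few lines; the sketch below assumes a trained network behind a `predict(image_patch, mask_patch)` callable and omits the section re-alignment and content-adjustment machinery the paper adds, so it is conceptual only.

```python
# Conceptual flood-filling segmentation: from a seed voxel, a trained
# network repeatedly refines an object-probability map inside a moving
# field of view; confident face-center positions become new inference
# sites. `predict` is a stand-in for the trained FFN.
from collections import deque
import numpy as np

def flood_fill(image, seed, predict, r=16, move_thresh=0.9):
    pom = np.full(image.shape, 0.05, dtype=np.float32)  # object probability
    pom[seed] = 0.95
    queue, visited = deque([seed]), set()
    while queue:
        pos = queue.popleft()
        if pos in visited:
            continue
        visited.add(pos)
        # Skip positions whose field of view would leave the volume.
        if any(c - r < 0 or c + r + 1 > s for c, s in zip(pos, image.shape)):
            continue
        sl = tuple(slice(c - r, c + r + 1) for c in pos)
        pom[sl] = predict(image[sl], pom[sl])  # network refines the mask
        # Re-seed at the six face centers where the mask is confident.
        for axis in range(3):
            for step in (-r, r):
                nxt = list(pos)
                nxt[axis] += step
                if pom[tuple(nxt)] > move_thresh:
                    queue.append(tuple(nxt))
    return pom > 0.5
```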
Digital reconstruction of neurons from microscope images is an important and challenging problem in neuroscience. In this paper, we propose a model-based method to tackle this problem. We first formulate a model structure, then develop an algorithm for computing it by carefully taking into account morphological characteristics of neurons, as well as the image properties under typical imaging protocols. The method has been tested on the data sets used in the DIADEM competition and produced promising results for four out of the five data sets.
We present a method to automatically identify and track nuclei in time-lapse microscopy recordings of entire developing embryos. The method combines deep learning and global optimization. On a mouse dataset, it reconstructs 75.8% of cell lineages spanning 1 h, as compared to 31.8% for the competing method. Our approach improves understanding of where and when cell fate decisions are made in developing embryos, tissues, and organs.
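As a much-simplified illustration of the linking step, the sketch below matches detected nucleus centroids between two consecutive frames with the Hungarian algorithm; the published method instead solves a global optimization over whole sequences and handles cell divisions, which this does not.

```python
# Frame-to-frame nucleus linking by optimal assignment (illustrative only;
# the published method optimizes globally and models divisions).
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def link_frames(centroids_t, centroids_t1, max_dist=10.0):
    """centroids_t, centroids_t1: (n, 3) and (m, 3) nucleus positions.
    Returns (i, j) index pairs linking frame t to frame t+1."""
    cost = cdist(centroids_t, centroids_t1)   # pairwise Euclidean distances
    rows, cols = linear_sum_assignment(cost)  # Hungarian algorithm
    # Drop implausibly long links (nuclei entering or leaving the volume).
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_dist]
```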