iBiology Courses provide trainees with just-in-time learning resources for becoming effective researchers. These courses help scientists build core research skills, plan their research projects and careers, and learn from scientists with diverse backgrounds.
Understanding the diversification of mammalian cell lineages is essential to embryonic development, organ regeneration and tissue engineering. Shortly after implantation in the uterus, the pluripotent cells of the mammalian epiblast generate the three germ layers: ectoderm, mesoderm and endoderm [1]. Although clonal analyses suggest early specification of epiblast cells towards particular cell lineages [2–4], single-cell transcriptomes do not identify lineage-specific markers in the epiblast [5–11], and thus the molecular regulation of such specification remains unknown. Here, we studied the epigenetic landscape of single epiblast cells, which revealed lineage priming towards endoderm, ectoderm or mesoderm. Unexpectedly, epiblast cells with mesodermal priming show a strong signature for the endothelial/endocardial fate, suggesting early specification of this lineage apart from other mesoderm. Through clonal analysis and live imaging, we show that endothelial precursors diverge early from the rest of the mesodermal derivatives. In particular, cardiomyocytes and endocardial cells show limited lineage relationship, despite being temporally and spatially co-recruited during gastrulation. Furthermore, by analysing the live tracks of single cells through unsupervised classification of cell migratory activity, we found early behavioral divergence of endothelial precursors shortly after the onset of mesoderm migration towards the cardiogenic area. These results provide a new model for the phenotypically silent specification of mammalian cell lineages in pluripotent cells of the epiblast and revise current knowledge of the sequence and timing of cardiovascular lineage diversification.
Determining cell identities in imaging sequences is an important yet challenging task. The conventional method for cell identification is cell tracking, which is complex and can be time-consuming. In this study, we propose an innovative approach to cell identification during early C. elegans embryogenesis using machine learning. We employed random forest, MLP, and LSTM models, and tested cell classification accuracy on 3D time-lapse confocal datasets spanning the first 4 hours of embryogenesis. By leveraging a small number of spatio-temporal features of individual cells, including cell trajectory and cell fate information, our models achieve an accuracy of over 90%, even with limited data. We also determine the most important feature contributions and interpret these features in the context of biological knowledge. Our research demonstrates the success of predicting cell identities in 4D imaging sequences directly from simple spatio-temporal features.
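The abstract's models (random forest, MLP, LSTM) and feature set are not detailed here; as a loose illustration of the idea — classifying cells from simple spatio-temporal trajectory features — the following is a minimal sketch using hypothetical features and a nearest-centroid classifier, with made-up identity labels:

```python
import numpy as np

def trajectory_features(track):
    """Summarize a (T, 3) array of 3D cell positions into simple
    spatio-temporal features: net displacement, path length, mean speed.
    These features are illustrative, not those used in the paper."""
    steps = np.diff(track, axis=0)
    step_lengths = np.linalg.norm(steps, axis=1)
    net = np.linalg.norm(track[-1] - track[0])
    return np.array([net, step_lengths.sum(), step_lengths.mean()])

def fit_centroids(features, labels):
    """Per-class mean feature vector (a stand-in for the paper's classifiers)."""
    return {c: features[labels == c].mean(axis=0) for c in set(labels)}

def predict(centroids, feat):
    """Assign the class whose centroid is nearest in feature space."""
    return min(centroids, key=lambda c: np.linalg.norm(feat - centroids[c]))

# Two synthetic tracks: one nearly stationary, one directed.
rng = np.random.default_rng(0)
still = rng.normal(0, 0.01, size=(20, 3))
moving = np.cumsum(rng.normal(0.5, 0.1, size=(20, 3)), axis=0)

X = np.array([trajectory_features(t) for t in (still, moving)])
y = np.array(["ABa-like", "ABp-like"])  # hypothetical identity labels
centroids = fit_centroids(X, y)
print(predict(centroids, trajectory_features(moving)))  # prints "ABp-like"
```

In practice, a supervised model such as a random forest would replace the centroid step, and trajectory features would be computed over tracked lineages rather than synthetic paths.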
The visual allure of microscopy makes it an intuitively powerful research tool. Intuition, however, can easily obscure or distort the reality of the information contained in an image. Common cognitive biases, combined with institutional pressures that reward positive research results, can quickly skew a microscopy project towards upholding, rather than rigorously challenging, a hypothesis. The impact of these biases on a variety of research topics is well known. What might be less appreciated are the many forms in which bias can permeate a microscopy experiment. Even well-intentioned researchers are susceptible to bias, which must therefore be actively recognized to be mitigated. Importantly, although image quantification has increasingly become an expectation, ostensibly to confront subtle biases, it is not a guarantee against bias and cannot alone shield an experiment from cognitive distortions. Here, we provide illustrative examples of the insidiously pervasive nature of bias in microscopy experiments - from initial experimental design to image acquisition, analysis and data interpretation. We then provide suggestions that can serve as guard rails against bias.
Background: Many Drosophila species use acoustic communication during courtship, and studies of these communication systems have provided insight into neurobiology, behavioral ecology, ethology, and evolution. Recording Drosophila courtship sounds and associated behavior is challenging, especially at high throughput, and previously designed devices are relatively expensive and complex to assemble. Results: We present construction plans for a modular system, built mostly from off-the-shelf, relatively inexpensive components, that provides simultaneous high-resolution audio and video recording of 96 isolated or paired Drosophila individuals. We provide open-source control software to record audio and video. We designed high-intensity LED arrays that can be used to perform optogenetic activation and inactivation of labelled neurons. The basic design can be modified to facilitate novel study designs or to record insects larger than Drosophila. Fewer than 96 microphones can be used if the full array is not required or to reduce costs. Implications: Our hardware design and software provide an improved platform for reliable and comparatively inexpensive high-throughput recording of Drosophila courtship acoustic and visual behavior, and potentially for recording acoustic signals of other small animals.
Chemotactic bacteria not only navigate chemical gradients, but also shape their environments by consuming and secreting attractants. Investigating how these processes influence the dynamics of bacterial populations has been challenging because of a lack of experimental methods for measuring spatial profiles of chemoattractants in real time. Here, we use a fluorescent sensor for aspartate to directly measure bacterially generated chemoattractant gradients during collective migration. Our measurements show that the standard Patlak-Keller-Segel model for collective chemotactic bacterial migration breaks down at high cell densities. To address this, we propose modifications to the model that consider the impact of cell density on bacterial chemotaxis and attractant consumption. With these changes, the model explains our experimental data across all cell densities, offering new insight into chemotactic dynamics. Our findings highlight the significance of considering cell density effects on bacterial behavior, and the potential for fluorescent metabolite sensors to shed light on the complex emergent dynamics of bacterial communities.
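For reference, the standard Patlak-Keller-Segel model mentioned above couples a cell-density field to an attractant field; in a common textbook form (symbols here are the conventional ones, not taken from the paper):

```latex
\frac{\partial \rho}{\partial t} = \nabla \cdot \left( D_\rho \nabla \rho - \chi\, \rho\, \nabla c \right),
\qquad
\frac{\partial c}{\partial t} = D_c \nabla^2 c - k\, \rho,
```

where $\rho$ is the bacterial density, $c$ the chemoattractant concentration, $D_\rho$ and $D_c$ diffusion coefficients, $\chi$ the chemotactic sensitivity, and $k$ the consumption rate. The modifications proposed in the paper make quantities such as $\chi$ and the consumption term depend on cell density, which the standard form above does not.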
The brain generates diverse neuron types which express unique homeodomain transcription factors (TFs) and assemble into precise neural circuits. Yet a mechanistic framework is lacking for how homeodomain TFs specify both neuronal fate and synaptic connectivity. We use Drosophila lamina neurons (L1-L5) to show the homeodomain TF Brain-specific homeobox (Bsh) is initiated in lamina precursor cells (LPCs) where it specifies L4/L5 fate and suppresses homeodomain TF Zfh1 to prevent L1/L3 fate. Subsequently, Bsh activates the homeodomain TF Apterous (Ap) in L4 in a feedforward loop to express the synapse recognition molecule DIP-β, in part by Bsh direct binding a DIP-β intron. Thus, homeodomain TFs function hierarchically: primary homeodomain TF (Bsh) first specifies neuronal fate, and subsequently acts with secondary homeodomain TF (Ap) to activate DIP-β, thereby generating precise synaptic connectivity. We speculate that hierarchical homeodomain TF function may represent a general principle for coordinating neuronal fate specification and circuit assembly.
The reconstruction of neural circuits from serial section electron microscopy (ssEM) images is being accelerated by automatic image segmentation methods. Segmentation accuracy is often limited by the preceding step of aligning 2D section images to create a 3D image stack. Precise and robust alignment in the presence of image artifacts is challenging, especially as datasets approach the petascale. We present a computational pipeline for aligning ssEM images with several key elements. Self-supervised convolutional nets are trained via metric learning to encode and align image pairs, and they are used to initialize iterative fine-tuning of alignment. A procedure called vector voting increases robustness to image artifacts or missing image data. For speed, the series is divided into blocks that are distributed to computational workers for alignment. The blocks are aligned to each other by composing transformations with decay, which achieves a global alignment without resorting to a time-consuming global optimization. We apply our pipeline to a whole fly brain dataset and show improved accuracy relative to the prior state of the art. We also demonstrate that our pipeline scales to a cubic millimeter of mouse visual cortex. Our pipeline is publicly available through two open-source Python packages.
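The details of the vector-voting procedure are not given in this summary; one simple way such voting can confer robustness, sketched here as an illustrative stand-in (not the published algorithm), is to combine several candidate displacement fields with a per-pixel median and flag pixels where no candidate agrees:

```python
import numpy as np

def vector_vote(fields, tol=2.0):
    """Combine candidate displacement fields of shape (N, H, W, 2).

    At each pixel, take the component-wise median of the N candidates,
    which suppresses isolated outliers (e.g. from image artifacts).
    Also return a mask of pixels where every candidate lies farther than
    `tol` from the median, so downstream steps can treat those as missing.
    Illustrative stand-in for the paper's vector voting, not the
    published procedure.
    """
    fields = np.asarray(fields, dtype=float)
    med = np.median(fields, axis=0)                 # (H, W, 2) voted field
    spread = np.linalg.norm(fields - med, axis=-1)  # (N, H, W) per-candidate error
    disagree = (spread > tol).all(axis=0)           # no candidate near the vote
    return med, disagree

# Three noisy candidates around a smooth true field; one has an artifact.
rng = np.random.default_rng(1)
gx, gy = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8))
true = np.stack([gx, gy], axis=-1)                  # (8, 8, 2)
cands = [true + rng.normal(0, 0.05, true.shape) for _ in range(3)]
cands[0][4, 4] += 50.0                              # simulated gross outlier
field, mask = vector_vote(cands)
print(np.abs(field - true).max() < 1.0, mask.sum())  # prints "True 0"
```

The median discards the single 50-pixel outlier because the other two candidates still agree at that location; with more candidates, more simultaneous artifacts can be tolerated.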
Light sheet microscopy is a powerful technique for visualizing dynamic biological processes in 3D. Studying large specimens or recording time series with high spatial and temporal resolution generates large datasets, often exceeding terabytes and potentially reaching petabytes in size. Handling these massive datasets is challenging for conventional data processing tools with their memory and performance limitations. To overcome these issues, we developed LLSM5DTools, a software solution specifically designed for the efficient management of petabyte-scale light sheet microscopy data. This toolkit, optimized for memory and performance, features fast image readers and writers, efficient geometric transformations, high-performance Richardson-Lucy deconvolution, and scalable Zarr-based stitching. These advancements enable LLSM5DTools to perform over ten times faster than current state-of-the-art methods, facilitating real-time processing of large datasets and opening new avenues for biological discoveries in large-scale imaging experiments.