2 Janelia Publications

Showing 1-2 of 2 results
    11/01/21 | Whole-cell organelle segmentation in volume electron microscopy.
    Heinrich L, Bennett D, Ackerman D, Park W, Bogovic J, Eckstein N, Petruncio A, Clements J, Pang S, Xu CS, Funke J, Korff W, Hess HF, Lippincott-Schwartz J, Saalfeld S, Weigel AV, COSEM Project Team
    Nature. 2021 Nov 1;599(7883):141-146. doi: 10.1038/s41586-021-03977-3

    Cells contain hundreds of organelles and macromolecular assemblies. Obtaining a complete understanding of their intricate organization requires the nanometre-level, three-dimensional reconstruction of whole cells, which is only feasible with robust and scalable automatic methods. Here, to support the development of such methods, we annotated up to 35 different cellular organelle classes, ranging from endoplasmic reticulum to microtubules to ribosomes, in diverse sample volumes from multiple cell types imaged at a near-isotropic resolution of 4 nm per voxel with focused ion beam scanning electron microscopy (FIB-SEM). We trained deep learning architectures to segment these structures in 4 nm and 8 nm per voxel FIB-SEM volumes, validated their performance and showed that automatic reconstructions can be used to directly quantify previously inaccessible metrics including spatial interactions between cellular components. We also show that such reconstructions can be used to automatically register light and electron microscopy images for correlative studies. We have created an open data and open-source web repository, 'OpenOrganelle', to share the data, computer code and trained models, which will enable scientists everywhere to query and further improve automatic reconstruction of these datasets.

    View Publication Page
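
    The OpenOrganelle repository named in this abstract serves the imaging data and segmentations as chunked N5/Zarr arrays over public S3. The sketch below shows one way to read such a volume lazily with zarr (2.x) and s3fs; the bucket name, dataset name, and internal array path are illustrative assumptions rather than details taken from the paper, so consult openorganelle.janelia.org for the actual layout.

        # Minimal sketch: lazily read an OpenOrganelle FIB-SEM volume over S3.
        # Assumes zarr 2.x (with N5 support) and s3fs are installed; the bucket
        # and array paths below are illustrative guesses, not from the paper.
        import zarr

        # Anonymous, read-only access to the public bucket (assumed name).
        store = zarr.N5FSStore(
            "s3://janelia-cosem-datasets/jrc_hela-2/jrc_hela-2.n5", anon=True
        )
        root = zarr.open(store, mode="r")

        # Hypothetical path to the full-resolution EM scale level.
        em = root["em/fibsem-uint16/s0"]
        print(em.shape, em.dtype)  # metadata only; no voxels downloaded yet

        # Pull a small sub-volume into memory.
        block = em[:32, :256, :256]
        print(block.mean())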
    07/01/21 | Automatic Detection of Synaptic Partners in a Whole-Brain Drosophila EM Dataset.
    Buhmann J, Sheridan A, Gerhard S, Krause R, Nguyen T, Heinrich L, Schlegel P, Lee WA, Wilson R, Saalfeld S, Jefferis G, Bock D, Turaga S, Cook M, Funke J
    Nature Methods. 2021 Jul 1;18(7):771-774. doi: 10.1038/s41592-021-01183-7

    The study of neural circuits requires the reconstruction of neurons and the identification of synaptic connections between them. To scale the reconstruction to the size of whole-brain datasets, semi-automatic methods are needed to solve those tasks. Here, we present an automatic method for synaptic partner identification in insect brains, which uses convolutional neural networks to identify post-synaptic sites and their pre-synaptic partners. The networks can be trained from human-generated point annotations alone and require only simple post-processing to obtain final predictions. We used our method to extract 244 million putative synaptic partners in the fifty-teravoxel full adult fly brain (FAFB) electron microscopy (EM) dataset and evaluated its accuracy on 146,643 synapses from 702 neurons with a total cable length of 312 mm in four different brain regions. The predicted synaptic connections can be used together with a neuron segmentation to infer a connectivity graph with high accuracy: 96% of edges between connected neurons are correctly classified as weakly connected (less than five synapses) or strongly connected (at least five synapses). Our synaptic partner predictions for the FAFB dataset are publicly available, together with a query library allowing automatic retrieval of up- and downstream neurons.

    View Publication Page
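
    The final step described in this abstract, turning per-synapse partner predictions into a connectivity graph whose edges are labelled weakly (fewer than five synapses) or strongly (at least five synapses) connected, reduces to counting synapses per ordered neuron pair. The sketch below illustrates that aggregation on a toy input; the (pre, post) tuple format is a hypothetical stand-in for the published predictions, and only the five-synapse cut-off comes from the abstract.

        # Sketch: aggregate predicted synaptic partners into a thresholded
        # connectivity graph. The input format is hypothetical; only the
        # five-synapse weak/strong cut-off comes from the abstract above.
        from collections import Counter

        # One tuple per predicted synapse: (pre-synaptic id, post-synaptic id).
        predicted_partners = [
            ("A", "B"), ("A", "B"), ("A", "B"), ("A", "B"), ("A", "B"),  # 5 -> strong
            ("A", "C"), ("A", "C"),                                      # 2 -> weak
            ("C", "B"),                                                  # 1 -> weak
        ]

        synapse_counts = Counter(predicted_partners)
        edges = {
            pair: ("strong" if count >= 5 else "weak")
            for pair, count in synapse_counts.items()
        }

        for (pre, post), label in sorted(edges.items()):
            print(f"{pre} -> {post}: {synapse_counts[(pre, post)]} synapses ({label})")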