5 Janelia Publications

Showing 1-5 of 5 results
    09/03/22 | The SARS-CoV-2 accessory protein Orf3a is not an ion channel, but does interact with trafficking proteins
    Alexandria N. Miller, Patrick R. Houlihan, Ella Matamala, Deny Cabezas-Bratesco, Gi Young Lee, Ben Cristofori-Armstrong, Tanya L. Dilan, Silvia Sanchez-Martinez, Doreen Matthies, Rui Yan, Zhiheng Yu, Dejian Ren, Sebastian E. Brauchi, David E. Clapham
    bioRxiv. 2022 Sep 03. doi: 10.1101/2022.09.02.506428

    The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and SARS-CoV-1 accessory protein Orf3a colocalizes with markers of the plasma membrane, endocytic pathway, and Golgi apparatus. Some reports have led to annotation of both Orf3a proteins as viroporins. Here we show that neither SARS-CoV-2 nor SARS-CoV-1 Orf3a forms functional ion-conducting pores, and that the measured conductances are common contaminants of overexpression systems and of reconstitution studies using high protein levels. Cryo-EM structures of both SARS-CoV-2 and SARS-CoV-1 Orf3a display a narrow constriction and a basic aqueous vestibule, which would not favor cation permeation. We observe enrichment of the late endosomal marker Rab7 upon SARS-CoV-2 Orf3a overexpression, and co-immunoprecipitation with VPS39. Interestingly, SARS-CoV-1 Orf3a does not cause the same cellular phenotype as SARS-CoV-2 Orf3a and does not interact with VPS39. To explain this difference, we find that a divergent, unstructured loop of SARS-CoV-2 Orf3a facilitates its binding to VPS39, a HOPS-complex tethering protein involved in the fusion of late endosomes and autophagosomes with lysosomes. We suggest that this added loop enhances the ability of SARS-CoV-2 Orf3a to co-opt host cellular trafficking mechanisms for viral exit or host immune evasion.

    View Publication Page
    02/01/22 | A neural circuit linking learning and sleep in Drosophila long-term memory.
    Lei Z, Henderson K, Keleman K
    Nature Communications. 2022 Feb 01;13(1):609. doi: 10.1038/s41467-022-28256-1

    Animals retain some but not all experiences in long-term memory (LTM). Sleep supports LTM retention across animal species. It is well established that learning experiences enhance post-learning sleep, but the mechanisms by which learning shapes sleep to support memory retention are not clear. Drosophila males display increased amounts of sleep after courtship learning. Courtship learning depends on mushroom body (MB) neurons, and post-learning sleep is mediated by the sleep-promoting ventral fan-shaped body neurons (vFBs). We show that post-learning sleep is regulated by two opposing MB output neurons (MBONs), which encode a measure of learning. The excitatory MBON-γ2α'1 becomes increasingly active with longer learning experiences, whereas the inhibitory MBON-β'2mp is activated only by a short learning experience. These MB outputs are integrated by SFS neurons, which excite vFBs to promote sleep after prolonged, but not short, training. This circuit may ensure that only longer or more intense learning experiences induce sleep and are thereby consolidated into LTM.

    View Publication Page
    09/03/20 | A connectome of the adult Drosophila central brain.
    Xu CS, Januszewski M, Lu Z, Takemura S, Hayworth KJ, Huang G, Shinomiya K, Maitin-Shepard J, Ackerman D, Berg S, Blakely T, Bogovic J, Clements J, Dolafi T, Hubbard P, Kainmueller D, Katz W, Kawase T, Khairy KA, Leavitt L, Li PH, Lindsey L, Neubarth N, Olbris DJ, Otsuna H, Troutman ET, Umayam L, Zhao T, Ito M, Goldammer J, Wolff T, Svirskas R, Schlegel P, Neace ER, Knecht CJ, Alvarado CX, Bailey DA, Ballinger S, Borycz JA, Canino BS
    eLife. 2020 Sep 03. doi: 10.1101/2020.01.21.911859

    The neural circuits responsible for behavior remain largely unknown. Previous efforts have reconstructed the complete circuits of small animals, with hundreds of neurons, and selected circuits for larger animals. Here we (the FlyEM project at Janelia and collaborators at Google) summarize new methods and present the complete circuitry of a large fraction of the brain of a much more complex animal, the fruit fly Drosophila melanogaster. Improved methods include new procedures to prepare, image, align, segment, find synapses in, and proofread such large data sets; new methods that define cell types based on connectivity in addition to morphology; and new methods to simplify access to a large and evolving data set. From the resulting data we derive a better definition of computational compartments and their connections; an exhaustive atlas of cell examples and types, many of them novel; detailed circuits for most of the central brain; and an exploration of the statistics and structure of different brain compartments and of the brain as a whole. We make the data public, with a web site and resources specifically designed to make them easy to explore, for all levels of expertise from the expert to the merely curious. The public availability of these data, and the simplified means to access them, dramatically reduces the effort needed to answer typical circuit questions, such as the identity of upstream and downstream neural partners and the circuitry of brain regions, and to link the neurons defined by our analysis with genetic reagents that can be used to study their functions.

    Note: In the next few weeks, we will release a series of papers with more involved discussions. One paper will detail the hemibrain reconstruction with more extensive analysis and interpretation made possible by this dense connectome. Another paper will explore the central complex, a brain region involved in navigation, motor control, and sleep. A final paper will present insights from the mushroom body, a center of multimodal associative learning in the fly brain.
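
    The released connectome can also be queried programmatically. Below is a minimal sketch, assuming access through the public neuPrint service and its neuprint-python client; the server address, dataset version, token placeholder, and example cell type are illustrative assumptions, not details taken from this abstract.

        # Minimal sketch (assumptions noted above): query the public hemibrain
        # connectome for neurons of a given type and list their body IDs.
        from neuprint import Client, NeuronCriteria as NC, fetch_neurons

        # The API token is obtained by signing in on the neuPrint web site (assumption).
        client = Client("neuprint.janelia.org", dataset="hemibrain:v1.2.1", token="YOUR_TOKEN")

        # Fetch all neurons annotated with an example cell type, plus their
        # synapse counts per brain region (ROI).
        neuron_df, roi_counts_df = fetch_neurons(NC(type="MBON01"))
        print(neuron_df[["bodyId", "type", "instance"]])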

    View Publication Page
    01/16/20 | Cortical pattern generation during dexterous movement is input-driven.
    Sauerbrei BA, Guo J, Cohen JD, Mischiati M, Guo W, Kabra M, Verma N, Mensh B, Branson K, Hantman AW
    Nature. 2020 Jan 16;577(7790):386-91. doi: 10.1038/s41586-019-1869-9

    The motor cortex controls skilled arm movement by sending temporal patterns of activity to lower motor centres. Local cortical dynamics are thought to shape these patterns throughout movement execution. External inputs have been implicated in setting the initial state of the motor cortex, but they may also have a pattern-generating role. Here we dissect the contribution of local dynamics and inputs to cortical pattern generation during a prehension task in mice. Perturbing cortex to an aberrant state prevented movement initiation, but after the perturbation was released, cortex either bypassed the normal initial state and immediately generated the pattern that controls reaching or failed to generate this pattern. The difference in these two outcomes was probably a result of external inputs. We directly investigated the role of inputs by inactivating the thalamus; this perturbed cortical activity and disrupted limb kinematics at any stage of the movement. Activation of thalamocortical axon terminals at different frequencies disrupted cortical activity and arm movement in a graded manner. Simultaneous recordings revealed that both thalamic activity and the current state of cortex predicted changes in cortical activity. Thus, the pattern generator for dexterous arm movement is distributed across multiple, strongly interacting brain regions.

    View Publication Page
    09/30/19 | ilastik: interactive machine learning for (bio)image analysis.
    Berg S, Kutra D, Kroeger T, Straehle CN, Kausler BX, Haubold C, Schiegg M, Ales J, Beier T, Rudy M, Eren K, Cervantes JI, Xu B, Beuttenmueller F, Wolny A, Zhang C, Koethe U, Hamprecht FA, Kreshuk A
    Nature Methods. 2019 Sep 30;16:1226-32. doi: 10.1038/s41592-019-0582-9

    We present ilastik, an easy-to-use interactive tool that brings machine-learning-based (bio)image analysis to end users without substantial computational expertise. It contains pre-defined workflows for image segmentation, object classification, counting and tracking. Users adapt the workflows to the problem at hand by interactively providing sparse training annotations for a nonlinear classifier. ilastik can process data in up to five dimensions (3D, time and number of channels). Its computational back end runs operations on-demand wherever possible, allowing for interactive prediction on data larger than RAM. Once the classifiers are trained, ilastik workflows can be applied to new data from the command line without further user interaction. We describe all ilastik workflows in detail, including three case studies and a discussion on the expected performance.
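
    As the abstract notes, trained ilastik workflows can be applied to new data from the command line without further interaction. Below is a minimal sketch of driving that headless mode from Python; the launcher path, project file, and data folder are hypothetical placeholders, and the flags follow ilastik's documented headless usage.

        # Minimal sketch (hypothetical paths): batch-apply a trained ilastik project
        # to a folder of new images by invoking the headless command-line mode.
        import subprocess
        from pathlib import Path

        ilastik_launcher = "run_ilastik.sh"      # path to the ilastik launcher (assumption)
        project_file = "pixel_classifier.ilp"    # project trained interactively in the GUI (hypothetical)

        for image in sorted(Path("new_data").glob("*.h5")):  # unseen images (hypothetical folder)
            # Each call runs the trained workflow on one image with no user interaction.
            subprocess.run(
                [ilastik_launcher, "--headless", f"--project={project_file}", str(image)],
                check=True,
            )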

    View Publication Page