Reiser Lab / Publications
3 Publications
    03/14/24 | Whole-body simulation of realistic fruit fly locomotion with deep reinforcement learning
    Roman Vaxenburg, Igor Siwanowicz, Josh Merel, Alice A Robie, Carmen Morrow, Guido Novati, Zinovia Stefanidi, Gwyneth M Card, Michael B Reiser, Matthew M Botvinick, Kristin M Branson, Yuval Tassa, Srinivas C Turaga
    bioRxiv. 2024 Mar 14. doi: 10.1101/2024.03.11.584515

    The body of an animal determines how the nervous system produces behavior. Therefore, detailed modeling of the neural control of sensorimotor behavior requires a detailed model of the body. Here we contribute an anatomically detailed biomechanical whole-body model of the fruit fly Drosophila melanogaster in the MuJoCo physics engine. Our model is general-purpose, enabling the simulation of diverse fly behaviors, both on land and in the air. We demonstrate the generality of our model by simulating realistic locomotion, both flight and walking. To support these behaviors, we have extended MuJoCo with phenomenological models of fluid forces and adhesion forces. Through data-driven end-to-end reinforcement learning, we demonstrate that these advances enable the training of neural network controllers capable of realistic locomotion along complex trajectories based on high-level steering control signals. With a visually guided flight task, we demonstrate a neural controller that can use the vision sensors of the body model to control and steer flight. Our project is an open-source platform for modeling neural control of sensorimotor behavior in an embodied context.

    06/22/23 | Small-field visual projection neurons detect translational optic flow and support walking control
    Mathew D. Isaacson, Jessica L. M. Eliason, Aljoscha Nern, Edward M. Rogers, Gus K. Lott, Tanya Tabachnik, William J. Rowell, Austin W. Edwards, Wyatt L. Korff, Gerald M. Rubin, Kristin Branson, Michael B. Reiser
    bioRxiv. 2023 Jun 22. doi: 10.1101/2023.06.21.546024

    Animals rely on visual motion for navigating the world, and research in flies has clarified how neural circuits extract information from moving visual scenes. However, the major pathways connecting these patterns of optic flow to behavior remain poorly understood. Using a high-throughput quantitative assay of visually guided behaviors and genetic neuronal silencing, we discovered a region in Drosophila’s protocerebrum critical for visual motion following. We used neuronal silencing, calcium imaging, and optogenetics to identify a single cell type, LPC1, that innervates this region, detects translational optic flow, and plays a key role in regulating forward walking. Moreover, the population of LPC1s can estimate the travelling direction, such as when gaze direction diverges from body heading. By linking specific cell types and their visual computations to specific behaviors, our findings establish a foundation for understanding how the nervous system uses vision to guide navigation.

    07/13/17 | Mapping the neural substrates of behavior.
    Robie AA, Hirokawa J, Edwards AW, Umayam LA, Lee A, Phillips ML, Card GM, Korff W, Rubin GM, Simpson JH, Reiser MB, Branson KM
    Cell. 2017 Jul 13;170(2):393-406. doi: 10.1016/j.cell.2017.06.032

    Assigning behavioral functions to neural structures has long been a central goal in neuroscience and is a necessary first step toward a circuit-level understanding of how the brain generates behavior. Here, we map the neural substrates of locomotion and social behaviors for Drosophila melanogaster using automated machine-vision and machine-learning techniques. From videos of 400,000 flies, we quantified the behavioral effects of activating 2,204 genetically targeted populations of neurons. We combined a novel quantification of anatomy with our behavioral analysis to create brain-behavior correlation maps, which are shared as browsable web pages and interactive software. Based on these maps, we generated hypotheses of regions of the brain causally related to sensory processing, locomotor control, courtship, aggression, and sleep. Our maps directly specify genetic tools to target these regions, which we used to identify a small population of neurons with a role in the control of walking.

    • We developed machine-vision methods to broadly and precisely quantify fly behavior
    • We measured effects of activating 2,204 genetically targeted neuronal populations
    • We created whole-brain maps of neural substrates of locomotor and social behaviors
    • We created resources for exploring our results and enabling further investigation

    Machine-vision analyses of large behavior and neuroanatomy data reveal whole-brain maps of regions associated with numerous complex behaviors.
