Lee Tzumin Lab / Publications

4 Publications

    03/20/24 | Motor neurons generate pose-targeted movements via proprioceptive sculpting.
    Gorko B, Siwanowicz I, Close K, Christoforou C, Hibbard KL, Kabra M, Lee A, Park J, Li SY, Chen AB, Namiki S, Chen C, Tuthill JC, Bock DD, Rouault H, Branson K, Ihrke G, Huston SJ
    Nature. 2024 Mar 20. doi: 10.1038/s41586-024-07222-5

    Motor neurons are the final common pathway through which the brain controls movement of the body, forming the basic elements from which all movement is composed. Yet how a single motor neuron contributes to control during natural movement remains unclear. Here we anatomically and functionally characterize the individual roles of the motor neurons that control head movement in the fly, Drosophila melanogaster. Counterintuitively, we find that activity in a single motor neuron rotates the head in different directions, depending on the starting posture of the head, such that the head converges towards a pose determined by the identity of the stimulated motor neuron. A feedback model predicts that this convergent behaviour results from motor neuron drive interacting with proprioceptive feedback. We identify and genetically suppress a single class of proprioceptive neuron that changes the motor neuron-induced convergence as predicted by the feedback model. These data suggest a framework for how the brain controls movements: instead of directly generating movement in a given direction by activating a fixed set of motor neurons, the brain controls movements by adding bias to a continuing proprioceptive-motor loop.
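
    For intuition about the feedback model described above, the sketch below simulates a toy proprioceptive-motor loop. It is not the authors' model: the single head-angle state, the gains, and the function name are illustrative assumptions. Motor neuron drive sets a target pose and proprioceptive feedback opposes the current angle, so the head converges to the same drive-specific pose from any starting posture.

```python
import numpy as np

def simulate_head_angle(theta0, mn_drive, k_mn=1.0, k_prop=1.0, dt=0.01, steps=2000):
    """Toy proprioceptive-motor loop (illustrative only, not the published model).

    The motor neuron supplies a constant drive; proprioceptive feedback reports
    the current head angle and opposes it, so the angle relaxes toward a pose
    set by the drive-to-feedback gain ratio, regardless of where it starts.
    """
    theta = theta0
    trace = [theta]
    for _ in range(steps):
        d_theta = k_mn * mn_drive - k_prop * theta  # drive minus proprioceptive feedback
        theta += d_theta * dt
        trace.append(theta)
    return np.array(trace)

# Different starting postures rotate in different directions but converge to the
# same pose set by the stimulated "motor neuron" (here, theta of about 10 degrees):
for theta0 in (-30.0, 0.0, 45.0):
    print(theta0, round(simulate_head_angle(theta0, mn_drive=10.0)[-1], 2))
```

    Reducing k_prop in this toy weakens the pull toward the target pose, loosely analogous to the change in convergence the paper predicts and observes when the proprioceptive neuron class is suppressed.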

    11/20/24 | Social state gates vision using three circuit mechanisms in Drosophila
    Catherine E. Schretter, Tom Hindmarsh Sten, Nathan Klapoetke, Mei Shao, Aljoscha Nern, Marisa Dreher, Daniel Bushey, Alice A. Robie, Adam L. Taylor, Kristin M. Branson, Adriane Otopalik, Vanessa Ruta, Gerald M. Rubin
    Nature. 2024 Nov 20. doi: 10.1038/s41586-024-08255-6

    Animals are often bombarded with visual information and must prioritize specific visual features based on their current needs. The neuronal circuits that detect and relay visual features have been well studied. Much less is known about how an animal adjusts its visual attention as its goals or environmental conditions change. During social behaviours, flies need to focus on nearby flies. Here we study how the flow of visual information is altered when female Drosophila enter an aggressive state. From the connectome, we identify three state-dependent circuit motifs poised to modify the response of an aggressive female to fly-sized visual objects: convergence of excitatory inputs from neurons conveying select visual features and internal state; dendritic disinhibition of select visual feature detectors; and a switch that toggles between two visual feature detectors. Using cell-type-specific genetic tools, together with behavioural and neurophysiological analyses, we show that each of these circuit motifs is used during female aggression. We reveal that features of this same switch operate in male Drosophila during courtship pursuit, suggesting that disparate social behaviours may share circuit mechanisms. Our study provides a compelling example of using the connectome to infer circuit mechanisms that underlie dynamic processing of sensory signals.
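
    As a purely illustrative abstraction (a toy with made-up numbers and names, not circuitry or parameters from the paper), the snippet below caricatures how the three motifs could gate a visual response: the internal state converges with excitatory visual drive, releases dendritic inhibition, and switches which of two feature detectors reaches the output.

```python
def gated_visual_response(feature_a, feature_b, inhibition, aggressive_state):
    """Toy caricature of three state-dependent circuit motifs (not the paper's circuit).

    1. Switch: the state selects which of two feature detectors drives the output.
    2. Convergence: the internal state adds to the excitatory visual drive.
    3. Disinhibition: the state suppresses inhibition onto the feature detector.
    """
    state = 1.0 if aggressive_state else 0.0
    selected = feature_a if aggressive_state else feature_b  # switch between detectors
    drive = selected + state                                 # excitatory convergence with state
    drive -= inhibition * (1.0 - state)                      # dendritic disinhibition when state is on
    return max(drive, 0.0)                                   # simple output rectification

# The same fly-sized visual input produces a stronger response in the aggressive state:
print(gated_visual_response(0.5, 0.2, inhibition=0.4, aggressive_state=False))  # 0.0 (suppressed)
print(gated_visual_response(0.5, 0.2, inhibition=0.4, aggressive_state=True))   # 1.5 (gated through)
```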

    11/04/24 | The Fly Disco: Hardware and software for optogenetics and fine-grained fly behavior analysis
    Robie AA, Taylor AL, Schretter CE, Kabra M, Branson K
    bioRxiv. 2024 Nov 04. doi: 10.1101/2024.11.04.621948

    In the fruit fly, Drosophila melanogaster, connectome data and genetic tools provide a unique opportunity to study complex behaviors including navigation, mating, aggression, and grooming in an organism with a tractable nervous system of 140,000 neurons. Here we present the Fly Disco, a flexible system for high-quality video collection, optogenetic manipulation, and fine-grained behavioral analysis of freely walking and socializing fruit fly groups. The data collection hardware and software automate the collection of videos synced to programmable optogenetic stimuli. Key pipeline features include behavioral analysis based on trajectories of 21 keypoints, optogenetic-specific summary statistics, and data visualization. We created the multifly dataset for pose estimation that includes 9701 examples enriched in complex behaviors. All hardware designs, software, and the multifly dataset are freely available.
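
    As a rough sketch of the kind of optogenetic-specific summary statistic such a pipeline might compute (the array shapes, keypoint layout, and function name are assumptions, not the Fly Disco API), one could compare keypoint-derived movement during and outside the stimulus-on periods:

```python
import numpy as np

def stim_triggered_summary(keypoints, stim_on, fps=30.0):
    """Compare keypoint-derived movement during vs. outside optogenetic stimulation.

    keypoints: float array of shape (frames, 21, 2), one (x, y) per tracked keypoint per frame
    stim_on:   boolean array of shape (frames,), True while the optogenetic stimulus is on
    Returns mean per-frame keypoint speed (pixels/s) during and outside stimulation.
    (Illustrative only; not the Fly Disco code or its data format.)
    """
    disp = np.linalg.norm(np.diff(keypoints, axis=0), axis=2)  # (frames-1, 21) per-keypoint displacement
    speed = disp.mean(axis=1) * fps                            # mean keypoint speed for each frame step
    on = stim_on[1:]                                           # align stimulus flags with frame-to-frame steps
    return speed[on].mean(), speed[~on].mean()
```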

    11/22/24 | Whole-body simulation of realistic fruit fly locomotion with deep reinforcement learning
    Vaxenburg R, Siwanowicz I, Merel J, Robie AA, Morrow C, Novati G, Stefanidi Z, Both G, Card GM, Reiser MB, Botvinick MM, Branson KM, Tassa Y, Turaga SC
    bioRxiv. 2024 Nov 22. doi: 10.1101/2024.03.11.584515

    The body of an animal influences how the nervous system produces behavior. Therefore, detailed modeling of the neural control of sensorimotor behavior requires a detailed model of the body. Here we contribute an anatomically-detailed biomechanical whole-body model of the fruit fly Drosophila melanogaster in the MuJoCo physics engine. Our model is general-purpose, enabling the simulation of diverse fly behaviors, both on land and in the air. We demonstrate the generality of our model by simulating realistic locomotion, both flight and walking. To support these behaviors, we have extended MuJoCo with phenomenological models of fluid forces and adhesion forces. Through data-driven end-to-end reinforcement learning, we demonstrate that these advances enable the training of neural network controllers capable of realistic locomotion along complex trajectories based on high-level steering control signals. We demonstrate the use of visual sensors and the re-use of a pre-trained general-purpose flight controller by training the model to perform visually guided flight tasks. Our project is an open-source platform for modeling neural control of sensorimotor behavior in an embodied context.
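
    A minimal sketch of loading and stepping such a body model with MuJoCo's Python bindings is below; the MJCF filename and the constant control vector are placeholders, not the project's actual file layout or control interface, where a trained policy would instead map sensor readings to actuator commands on each step.

```python
import numpy as np
import mujoco

# Load a fly body model from an MJCF file (placeholder path, not the project's actual file).
model = mujoco.MjModel.from_xml_path("fruitfly.xml")
data = mujoco.MjData(model)

# Step the physics with a fixed control vector; a trained neural-network controller
# would instead read data.sensordata and write data.ctrl on every step.
ctrl = np.zeros(model.nu)
for _ in range(1000):
    data.ctrl[:] = ctrl
    mujoco.mj_step(model, data)

print(data.time, data.qpos[:3])  # simulated time and root position after 1000 steps
```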
