4 Publications
The body of an animal influences how its nervous system generates behavior. Accurately modeling the neural control of sensorimotor behavior therefore requires an anatomically detailed biomechanical representation of the body. Here, we introduce a whole-body model of the fruit fly Drosophila melanogaster in a physics simulator. Designed as a general-purpose framework, our model enables the simulation of diverse fly behaviors, including both terrestrial and aerial locomotion. We validate its versatility by replicating realistic walking and flight behaviors. To support these behaviors, we develop new phenomenological models for fluid and adhesion forces. Using data-driven, end-to-end reinforcement learning, we train neural network controllers capable of generating naturalistic locomotion along complex trajectories in response to high-level steering commands. Additionally, we demonstrate the use of visual sensors and hierarchical motor control, training a high-level controller to reuse a pre-trained low-level flight controller to perform visually guided flight tasks. Our model serves as an open-source platform for studying the neural control of sensorimotor behavior in an embodied context. Preprint: www.biorxiv.org/content/early/2024/03/14/2024.03.11.584515
The body of an animal influences how the nervous system produces behavior. Therefore, detailed modeling of the neural control of sensorimotor behavior requires an equally detailed model of the body. Here we contribute an anatomically detailed biomechanical whole-body model of the fruit fly Drosophila melanogaster in the MuJoCo physics engine. Our model is general-purpose, enabling the simulation of diverse fly behaviors, both on land and in the air. We demonstrate the generality of our model by simulating realistic locomotion, both flight and walking. To support these behaviors, we have extended MuJoCo with phenomenological models of fluid forces and adhesion forces. Through data-driven, end-to-end reinforcement learning, we demonstrate that these advances enable the training of neural network controllers capable of realistic locomotion along complex trajectories based on high-level steering control signals. We demonstrate the use of visual sensors and the reuse of a pre-trained general-purpose flight controller by training the model to perform visually guided flight tasks. Our project is an open-source platform for modeling neural control of sensorimotor behavior in an embodied context.
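The hierarchical control scheme described above, in which a high-level controller issues steering commands to a pre-trained, frozen low-level locomotion controller, can be sketched in the abstract. All class and function names below are hypothetical illustrations of the idea, not the paper's actual code or API:

```python
from dataclasses import dataclass

# Illustrative sketch of hierarchical motor control: a frozen
# low-level controller turns a steering command into motor outputs,
# while a high-level policy chooses the command from task
# observations (e.g. vision). All names and units are invented.

@dataclass
class SteeringCommand:
    heading: float  # desired turn rate (illustrative units)
    speed: float    # desired forward speed

class LowLevelController:
    """Stands in for a pre-trained locomotion policy; here it just
    maps a steering command to a toy two-element motor vector."""
    def act(self, command: SteeringCommand) -> list[float]:
        left = command.speed - command.heading
        right = command.speed + command.heading
        return [left, right]

class HighLevelController:
    """Stands in for a task policy trained on top of the frozen
    low-level controller; steers toward a target x-position."""
    def act(self, x: float, target_x: float) -> SteeringCommand:
        heading = max(-1.0, min(1.0, 0.5 * (target_x - x)))
        return SteeringCommand(heading=heading, speed=1.0)

# One control step: task observation -> steering command -> motors.
high, low = HighLevelController(), LowLevelController()
command = high.act(x=0.0, target_x=2.0)
motors = low.act(command)
```

In training, only the high-level policy's parameters would be updated; the low-level controller is reused as-is, which is the design choice the abstract describes for visually guided flight.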
Animals rely on visual motion for navigating the world, and research in flies has clarified how neural circuits extract information from moving visual scenes. However, the major pathways connecting these patterns of optic flow to behavior remain poorly understood. Using a high-throughput quantitative assay of visually guided behaviors and genetic neuronal silencing, we discovered a region in Drosophila's protocerebrum critical for visual motion following. We used neuronal silencing, calcium imaging, and optogenetics to identify a single cell type, LPC1, that innervates this region, detects translational optic flow, and plays a key role in regulating forward walking. Moreover, the population of LPC1s can estimate the direction of travel, for example when gaze direction diverges from body heading. By linking specific cell types and their visual computations to specific behaviors, our findings establish a foundation for understanding how the nervous system uses vision to guide navigation.
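The idea that a population of direction-tuned cells, like the LPC1 population described above, can jointly encode the direction of travel is commonly illustrated with population-vector decoding. The tuning curve and all numbers below are illustrative assumptions, not measurements from the paper:

```python
import math

# Illustrative population-vector decoding: each cell has a preferred
# direction and rectified-cosine tuning (an assumption made here for
# illustration); the decoded travel direction is the angle of the
# response-weighted sum of the cells' preferred-direction vectors.

def responses(travel_dir, preferred_dirs):
    # Half-wave rectified cosine tuning.
    return [max(0.0, math.cos(travel_dir - p)) for p in preferred_dirs]

def decode(rates, preferred_dirs):
    x = sum(r * math.cos(p) for r, p in zip(rates, preferred_dirs))
    y = sum(r * math.sin(p) for r, p in zip(rates, preferred_dirs))
    return math.atan2(y, x)

prefs = [i * 2 * math.pi / 8 for i in range(8)]  # 8 evenly spaced cells
true_dir = 0.7  # radians; gaze and travel direction need not coincide
rates = responses(true_dir, prefs)
estimate = decode(rates, prefs)
```

With an even number of evenly spaced preferred directions, the rectified parts of opposite cells cancel in the vector sum, so this toy decoder recovers the travel direction exactly.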
Assigning behavioral functions to neural structures has long been a central goal in neuroscience and is a necessary first step toward a circuit-level understanding of how the brain generates behavior. Here, we map the neural substrates of locomotion and social behaviors for Drosophila melanogaster using automated machine-vision and machine-learning techniques. From videos of 400,000 flies, we quantified the behavioral effects of activating 2,204 genetically targeted populations of neurons. We combined a novel quantification of anatomy with our behavioral analysis to create brain-behavior correlation maps, which are shared as browsable web pages and interactive software. Based on these maps, we generated hypotheses of regions of the brain causally related to sensory processing, locomotor control, courtship, aggression, and sleep. Our maps directly specify genetic tools to target these regions, which we used to identify a small population of neurons with a role in the control of walking.

• We developed machine-vision methods to broadly and precisely quantify fly behavior
• We measured effects of activating 2,204 genetically targeted neuronal populations
• We created whole-brain maps of neural substrates of locomotor and social behaviors
• We created resources for exploring our results and enabling further investigation

Machine-vision analyses of large behavior and neuroanatomy data reveal whole-brain maps of regions associated with numerous complex behaviors.
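A brain-behavior correlation map of the kind described above pairs, for each genetically targeted line, an anatomical expression measure with a behavioral-effect measure, and correlates them across lines. A minimal sketch of one map entry, using invented toy numbers rather than the study's measurements:

```python
import math

# Toy sketch of one brain-behavior correlation map entry: correlate,
# across genetically targeted lines, each line's expression level in
# a single brain region with its effect on a single behavior
# statistic. All values are invented for illustration.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Expression of five hypothetical lines in one region, and each
# line's change in walking speed on activation (toy values).
expression   = [0.1, 0.8, 0.3, 0.9, 0.2]
speed_effect = [0.0, 0.7, 0.2, 0.9, 0.1]

# A high correlation flags the region as a candidate substrate
# for walking control, to be tested with targeted genetic tools.
r = pearson(expression, speed_effect)
```

Repeating this over every (region, behavior) pair yields the kind of browsable map the abstract describes; the real study's anatomical and behavioral quantifications are of course far richer than this two-vector toy.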