2 Janelia Publications
The brain generates diverse neuron types, which express unique homeodomain transcription factors (TFs) and assemble into precise neural circuits. Yet a mechanistic framework is lacking for how homeodomain TFs specify both neuronal fate and synaptic connectivity. We use Drosophila lamina neurons (L1-L5) to show that the homeodomain TF Brain-specific homeobox (Bsh) is initiated in lamina precursor cells (LPCs), where it specifies L4/L5 fate and suppresses the homeodomain TF Zfh1 to prevent L1/L3 fate. Subsequently, Bsh activates the homeodomain TF Apterous (Ap) in L4 in a feedforward loop to express the synapse recognition molecule DIP-β, in part through direct binding of Bsh to a DIP-β intron. Thus, homeodomain TFs function hierarchically: the primary homeodomain TF (Bsh) first specifies neuronal fate, and subsequently acts with the secondary homeodomain TF (Ap) to activate DIP-β, thereby generating precise synaptic connectivity. We speculate that hierarchical homeodomain TF function may represent a general principle for coordinating neuronal fate specification and circuit assembly.
The body of an animal determines how the nervous system produces behavior. Therefore, detailed modeling of the neural control of sensorimotor behavior requires a detailed model of the body. Here we contribute an anatomically detailed biomechanical whole-body model of the fruit fly Drosophila melanogaster in the MuJoCo physics engine. Our model is general-purpose, enabling the simulation of diverse fly behaviors, both on land and in the air. We demonstrate the generality of our model by simulating realistic locomotion, both flight and walking. To support these behaviors, we have extended MuJoCo with phenomenological models of fluid forces and adhesion forces. Through data-driven end-to-end reinforcement learning, we demonstrate that these advances enable the training of neural network controllers capable of realistic locomotion along complex trajectories based on high-level steering control signals. With a visually guided flight task, we demonstrate a neural controller that can use the vision sensors of the body model to control and steer flight. Our project is an open-source platform for modeling neural control of sensorimotor behavior in an embodied context.

Competing Interest Statement: The authors have declared no competing interest.