Sensory systems encode information about the external world, and motor systems generate movements, but how do the two communicate to generate sensory-guided behavior? I record the electrical activity of fly motor neurons during visual stimulation to determine how these neurons decode their visual inputs and generate appropriate behavior. Working in the fly lets me draw on the powerful Drosophila genetic toolkit, which also allows me to manipulate the activity of specific neurons during my experiments.
Considerable scientific effort has gone into understanding how fly visual neurons respond to and encode sensory inputs. Less is known about how the responses of these neurons are used to guide movements of the fly. In the specific case of fly gaze-stabilization behavior, the relevant motor neurons drive muscles that move the head to keep the eyes level. These motor neurons receive direct synaptic inputs from visual neurons. The comparative simplicity of the circuit provides an exciting opportunity to study how motor neurons process their visual system inputs. I perform patch-clamp and extracellular recordings from these motor neurons while presenting visual stimuli to determine the algorithms that the motor neurons use to extract appropriate information from the visual system. Studying these questions in Drosophila gives me access to an ever-improving repertoire of genetic tools for targeting manipulations to specific single neuron types. I use such techniques to manipulate upstream sensory neurons during my motor neuron recordings and behavioral experiments, to determine the biological mechanisms that underlie the responses I observe.
By studying this system I hope not only to understand how this part of the fly nervous system works, but also to uncover general principles applicable to understanding how the motor and sensory systems of all animals interact at the neural level to generate behavior.
Sensorimotor integration is a field rich in theory backed by a large body of psychophysical evidence. Relating the underlying neural circuitry to these theories has, however, been more challenging. With a wide array of complex behaviors coordinated by their small brains, insects provide powerful model systems to study key features of sensorimotor integration at a mechanistic level. Insect neural circuits perform both hard-wired and learned sensorimotor transformations. They modulate their neural processing based on both internal variables, such as the animal's behavioral state, and external ones, such as the time of day. Here we present some studies using insect model systems that have produced insights, at the level of individual neurons, about sensorimotor integration and the various ways in which it can be modified by context.
Prior Publications (2)
Nonlinear integration of visual and haltere inputs in fly neck motor neurons.
S. J. Huston and H. G. Krapp, The Journal of Neuroscience: The Official Journal of the Society for Neuroscience, 29:13097-13105 (2009)
Animals use information from multiple sensory organs to generate appropriate behavior. Exactly how these different sensory inputs are fused at the motor system is not well understood. Here we study how fly neck motor neurons integrate information from two well-characterized sensory systems: visual information from the compound eye and gyroscopic information from the mechanosensory halteres. Extracellular recordings reveal that a subpopulation of neck motor neurons displays "gating-like" behavior: they do not fire action potentials in response to visual stimuli alone but will do so if the halteres are coactivated. Intracellular recordings show that these motor neurons receive small, sustained subthreshold visual inputs in addition to larger inputs that are phase-locked to haltere movements. Our results suggest that the nonlinear gating-like effect results from the summation of these two inputs, with the action potential threshold providing the nonlinearity. As a result of this summation, the sustained visual depolarization is transformed into a temporally structured train of action potentials synchronized to the haltere beating movements. This simple mechanism efficiently fuses two different sensory signals and may also explain the context-dependent effects of visual inputs on fly behavior.
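The proposed mechanism, a spike threshold applied to the sum of a sustained subthreshold visual input and an oscillatory haltere input, can be sketched as a toy simulation. All amplitudes, the oscillation frequency, and the threshold below are illustrative assumptions for the sketch, not values from the paper:

```python
import numpy as np

# Toy illustration (not the paper's model): a sustained, subthreshold
# visual depolarization summed with an oscillatory haltere input,
# followed by a fixed spike threshold.

dt = 1e-4                                  # time step, s
t = np.arange(0, 0.05, dt)                 # 50 ms of simulated time
threshold = 1.0                            # spike threshold, arbitrary units

visual = 0.5 * np.ones_like(t)             # sustained, subthreshold input
haltere = 0.7 * np.sin(2 * np.pi * 200 * t)  # assumed ~200 Hz haltere-locked input
haltere = np.clip(haltere, 0, None)        # keep only the depolarizing phase

def spike_count(v, thr):
    """Count upward threshold crossings of a 'membrane potential' trace."""
    above = v >= thr
    return int(np.sum(above[1:] & ~above[:-1]))

print(spike_count(visual, threshold))            # visual alone: no spikes
print(spike_count(haltere, threshold))           # haltere alone: no spikes
print(spike_count(visual + haltere, threshold))  # combined: spikes, one per haltere cycle
```

Neither input alone crosses threshold, but their sum crosses once per haltere cycle, reproducing both the gating-like behavior and the phase-locking of the resulting spike train.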
For sensory signals to control an animal's behavior, they must first be transformed into a format appropriate for use by its motor systems. This fundamental problem is faced by all animals, including humans. Beyond simple reflexes, little is known about how such sensorimotor transformations take place. Here we describe how the outputs of a well-characterized population of fly visual interneurons, lobula plate tangential cells (LPTCs), are used by the animal's gaze-stabilizing neck motor system. The LPTCs respond to visual input arising from both self-rotations and translations of the fly. The neck motor system, however, is involved in gaze stabilization and thus mainly controls compensatory head rotations. We investigated how the neck motor system is able to selectively extract rotation information from the mixed responses of the LPTCs. We recorded extracellularly from fly neck motor neurons (NMNs) and mapped the directional preferences across their extended visual receptive fields. Our results suggest that, like the tangential cells, NMNs are tuned to panoramic retinal image shifts, or optic flow fields, which occur when the fly rotates about particular body axes. In many cases, tangential cells and motor neurons appear to be tuned to similar axes of rotation, resulting in a correlation between the coordinate systems the two neural populations employ. However, in contrast to the primarily monocular receptive fields of the tangential cells, most NMNs are sensitive to visual motion presented to either eye. This results in the NMNs being more selective for rotation than the LPTCs. Thus, the neck motor system increases its rotation selectivity by a comparatively simple mechanism: the integration of binocular visual motion information.
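The final point, that combining motion signals from the two eyes can cancel the translation component of optic flow while preserving rotation, can be illustrated with a deliberately simplified toy model. The sign conventions and the binocular subtraction below are my assumptions for the sketch, not the paper's measured circuit:

```python
# Toy sketch: why binocular integration increases rotation selectivity.
# Horizontal motion on each eye is represented by a single signed number
# (positive = front-to-back). During yaw rotation the two eyes see motion
# in opposite directions; during forward translation they see motion in
# the same direction. These conventions are simplifying assumptions.

def eye_signals(rotation, translation):
    """Front-to-back motion seen by the left and right eyes."""
    left = translation + rotation   # rotation adds front-to-back motion on the left
    right = translation - rotation  # ...and back-to-front motion on the right
    return left, right

def monocular_response(rotation, translation):
    """A cell reading only the left eye confounds rotation and translation."""
    left, _ = eye_signals(rotation, translation)
    return left

def binocular_response(rotation, translation):
    """Opposing the two eyes cancels translation and doubles rotation."""
    left, right = eye_signals(rotation, translation)
    return left - right

print(monocular_response(rotation=1.0, translation=0.0))  # 1.0
print(monocular_response(rotation=0.0, translation=1.0))  # 1.0 (confounded)
print(binocular_response(rotation=1.0, translation=0.0))  # 2.0
print(binocular_response(rotation=0.0, translation=1.0))  # 0.0 (rotation-selective)
```

The monocular readout responds identically to rotation and translation, while the binocular readout responds only to rotation, mirroring the abstract's conclusion that binocular integration is a simple route to rotation selectivity.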