29 Publications
Modern recording techniques now permit brain-wide sensorimotor circuits to be observed at single neuron resolution in small animals. Extracting theoretical understanding from these recordings requires principles that organize findings and guide future experiments. Here we review theoretical principles that shed light onto brain-wide sensorimotor processing. We begin with an analogy that conceptualizes principles as streetlamps that illuminate the empirical terrain, and we illustrate the analogy by showing how two familiar principles apply in new ways to brain-wide phenomena. We then focus the bulk of the review on describing three more principles that have wide utility for mapping brain-wide neural activity, making testable predictions from highly parameterized mechanistic models, and investigating the computational determinants of neuronal response patterns across the brain.
All animals must transform ambiguous sensory data into successful behavior. This requires sensory representations that accurately reflect the statistics of natural stimuli and behavior. Multiple studies show that visual motion processing is tuned for accuracy under naturalistic conditions, but the sensorimotor circuits extracting these cues and implementing motion-guided behavior remain unclear. Here we show that the larval zebrafish retina extracts a diversity of naturalistic motion cues, and the retinorecipient pretectum organizes these cues around the elements of behavior. We find that higher-order motion stimuli, gliders, induce optomotor behavior matching expectations from natural scene analyses. We then image activity of retinal ganglion cell terminals and pretectal neurons. The retina exhibits direction-selective responses across glider stimuli, and anatomically clustered pretectal neurons respond with magnitudes matching behavior. Peripheral computations thus reflect natural input statistics, whereas central brain activity precisely codes information needed for behavior. This general principle could organize sensorimotor transformations across animal species.
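The "glider" stimuli mentioned above constrain higher-order spatiotemporal correlations while leaving lower-order statistics random. As a rough illustration of the idea (not the authors' actual stimulus code; the function name and conventions here are hypothetical), a binary three-point "converging" glider can be built so that every product of two neighboring pixels at one frame and the pixel below-right at the next frame equals a fixed parity:

```python
import random

def three_point_glider(width, frames, parity=+1, seed=0):
    """Binary (+1/-1) movie in which every triple product
    I(x-1, t-1) * I(x, t-1) * I(x, t) equals `parity`.
    Illustrative sketch of a 'converging' three-point glider."""
    rng = random.Random(seed)
    # First frame and the left edge of each frame are free random seeds.
    rows = [[rng.choice((-1, 1)) for _ in range(width)]]
    for _ in range(1, frames):
        prev = rows[-1]
        row = [rng.choice((-1, 1))]  # free pixel at x = 0
        for x in range(1, width):
            # Fixing row[x] = parity * prev[x-1] * prev[x] enforces the
            # triple-correlation constraint, since (+/-1)^2 = 1.
            row.append(parity * prev[x - 1] * prev[x])
        rows.append(row)
    return rows
```

Positive- and negative-parity versions of such stimuli are perceived as moving in opposite directions by both flies and fish, even though they contain no net pairwise motion energy.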
Breakthrough technologies for monitoring and manipulating single-neuron activity provide unprecedented opportunities for whole-brain neuroscience in larval zebrafish [1–9]. Understanding the neural mechanisms of visually guided behavior also requires precise stimulus control, but little prior research has accounted for physical distortions that result from refraction and reflection at an air-water interface that usually separates the projected stimulus from the fish [10–12]. Here we provide a computational tool that transforms between projected and received stimuli in order to detect and control these distortions. The tool considers the most commonly encountered interface geometry, and we show that this and other common configurations produce stereotyped distortions. By correcting these distortions, we reduced discrepancies in the literature concerning stimuli that evoke escape behavior [13,14], and we expect this tool will help reconcile other confusing aspects of the literature. This tool also aids experimental design, and we illustrate the dangers that uncorrected stimuli pose to receptive field mapping experiments.
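The core physics behind such distortions is Snell's law at a flat air-water interface: a ray projected at some angle from the normal bends toward the normal on entering water, so the fish receives the stimulus at a compressed angular position. A minimal sketch (not the published tool; the function name and flat-interface assumption are mine):

```python
import math

N_AIR = 1.000    # refractive index of air
N_WATER = 1.333  # refractive index of water (approximate, room temperature)

def received_angle(projected_angle_deg):
    """Angle of a ray inside the water, given its angle of incidence in air
    (both measured from the normal to a flat air-water interface).
    Snell's law: n_air * sin(theta_air) = n_water * sin(theta_water)."""
    theta_air = math.radians(projected_angle_deg)
    theta_water = math.asin((N_AIR / N_WATER) * math.sin(theta_air))
    return math.degrees(theta_water)
```

Because the compression grows with angle, a stimulus that looks uniform on the projection surface is nonuniformly warped on the fish's retina, which is exactly why uncorrected stimuli can bias receptive field estimates.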
Animals detect motion using a variety of visual cues that reflect regularities in the natural world. Experiments in animals across phyla have shown that motion percepts incorporate both pairwise and triplet spatiotemporal correlations that could theoretically benefit motion computation. However, it remains unclear how visual systems assemble these cues to build accurate motion estimates. Here we used systematic behavioral measurements of fruit fly motion perception to show how flies combine local pairwise and triplet correlations to reduce variability in motion estimates across natural scenes. By generating synthetic images with statistics controlled by maximum entropy distributions, we show that the triplet correlations are useful only when images have light-dark asymmetries that mimic natural ones. This suggests that asymmetric ON-OFF processing is tuned to the particular statistics of natural scenes. Since all animals encounter the world's light-dark asymmetries, many visual systems are likely to use asymmetric ON-OFF processing to improve motion estimation.
Goal-directed animal behaviors are typically composed of sequences of motor actions whose order and timing are critical for a successful outcome. Although numerous theoretical models for sequential action generation have been proposed, few have been supported by the identification of control neurons sufficient to elicit a sequence. Here, we identify a pair of descending neurons that coordinate a stereotyped sequence of engagement actions during Drosophila melanogaster male courtship behavior. These actions are initiated sequentially but persist cumulatively, a feature not explained by existing models of sequential behaviors. We find evidence consistent with a ramp-to-threshold mechanism, in which increasing neuronal activity elicits each action independently at successively higher activity thresholds.
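The ramp-to-threshold idea above can be stated compactly: a single rising activity trace crosses a series of fixed thresholds, so actions begin sequentially yet persist cumulatively. A toy sketch under those assumptions (the function and threshold values are illustrative, not from the paper):

```python
def action_onsets(activity, thresholds):
    """Return the first time step at which the activity trace reaches each
    action's threshold. With a monotonically rising (ramping) trace, actions
    with higher thresholds start later, and earlier actions simply continue."""
    onsets = {}
    for t, level in enumerate(activity):
        for action, threshold in thresholds.items():
            if action not in onsets and level >= threshold:
                onsets[action] = t
    return onsets
```

For a ramp like `[0, 1, 2, 3, 4, 5]` with thresholds 1, 3, and 5, the three actions start at successive times while none of them ends, matching the cumulative persistence the authors describe.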
Both vertebrates and invertebrates perceive illusory motion, known as "reverse-phi," in visual stimuli that contain sequential luminance increments and decrements. However, increment (ON) and decrement (OFF) signals are initially processed by separate visual neurons, and parallel elementary motion detectors downstream respond selectively to the motion of light or dark edges, often termed ON- and OFF-edges. It remains unknown how and where ON and OFF signals combine to generate reverse-phi motion signals. Here, we show that each of Drosophila's elementary motion detectors encodes motion by combining both ON and OFF signals. Their pattern of responses reflects combinations of increments and decrements that co-occur in natural motion, serving to decorrelate their outputs. These results suggest that the general principle of signal decorrelation drives the functional specialization of parallel motion detection channels, including their selectivity for moving light or dark edges.
Neural network remodeling underpins the ability to remember life experiences, but little is known about the long-term plasticity of neural populations. To study how the brain encodes episodic events, we used time-lapse two-photon microscopy and a fluorescent reporter of neural plasticity based on an enhanced form of the synaptic activity-responsive element (E-SARE) within the Arc promoter to track thousands of CA1 hippocampal pyramidal cells over weeks in mice that repeatedly encountered different environments. Each environment evokes characteristic patterns of ensemble neural plasticity, but with each encounter, the set of activated cells gradually evolves. After repeated exposures, the plasticity patterns evoked by an individual environment progressively stabilize. Compared with young adults, plasticity patterns in aged mice are less specific to individual environments and less stable across repeat experiences. Long-term consolidation of hippocampal plasticity patterns may support long-term memory formation, whereas weaker consolidation in aged subjects might reflect declining memory function.
Detailed descriptions of brain-scale sensorimotor circuits underlying vertebrate behavior remain elusive. Recent advances in zebrafish neuroscience offer new opportunities to dissect such circuits via whole-brain imaging, behavioral analysis, functional perturbations, and network modeling. Here, we harness these tools to generate a brain-scale circuit model of the optomotor response, an orienting behavior evoked by visual motion. We show that such motion is processed by diverse neural response types distributed across multiple brain regions. To transform sensory input into action, these regions sequentially integrate eye- and direction-specific sensory streams, refine representations via interhemispheric inhibition, and demix locomotor instructions to independently drive turning and forward swimming. While experiments revealed many neural response types throughout the brain, modeling identified the dimensions of functional connectivity most critical for the behavior. We thus reveal how distributed neurons collaborate to generate behavior and illustrate a paradigm for distilling functional circuit models from whole-brain data.
Many animals use visual signals to estimate motion. Canonical models suppose that animals estimate motion by cross-correlating pairs of spatiotemporally separated visual signals, but recent experiments indicate that humans and flies perceive motion from higher-order correlations that signify motion in natural environments. Here we show how biologically plausible processing motifs in neural circuits could be tuned to extract this information. We emphasize how known aspects of Drosophila's visual circuitry could embody this tuning and predict fly behavior. We find that segregating motion signals into ON/OFF channels can enhance estimation accuracy by accounting for natural light/dark asymmetries. Furthermore, a diversity of inputs to motion detecting neurons can provide access to more complex higher-order correlations. Collectively, these results illustrate how non-canonical computations improve motion estimation with naturalistic inputs. This argues that the complexity of the fly's motion computations, implemented in its elaborate circuits, represents a valuable feature of its visual motion estimator.
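The canonical pairwise model referenced here is the Hassenstein-Reichardt correlator: each arm multiplies one photoreceptor signal by a delayed copy of its neighbor, and subtracting the two mirror-image arms yields a direction-signed output. A minimal discrete-time sketch (a one-sample delay stands in for the low-pass filter; naming and simplifications are mine):

```python
def reichardt(left, right, delay=1):
    """Minimal Hassenstein-Reichardt correlator over two input time series.
    Each arm correlates one input with a delayed copy of the other; the
    opponent subtraction makes the output positive for left-to-right motion
    and negative for right-to-left motion."""
    out = []
    for t in range(delay, len(left)):
        out.append(left[t - delay] * right[t] - right[t - delay] * left[t])
    return out
```

A pulse that appears first on the left input and one step later on the right input drives the summed output positive; reversing the sequence flips the sign. Models of this class are blind to the triplet correlations discussed above, which is why the non-canonical motifs in this work are needed to exploit them.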
In order to localize the neural circuits involved in generating behaviors, it is necessary to assign activity onto anatomical maps of the nervous system. Using brain registration across hundreds of larval zebrafish, we have built an expandable open-source atlas containing molecular labels and definitions of anatomical regions, the Z-Brain. Using this platform and immunohistochemical detection of phosphorylated extracellular signal–regulated kinase (ERK) as a readout of neural activity, we have developed a system to create and contextualize whole-brain maps of stimulus- and behavior-dependent neural activity. This mitogen-activated protein kinase (MAP)-mapping assay is technically simple, and data analysis is completely automated. Because MAP-mapping is performed on freely swimming fish, it is applicable to studies of nearly any stimulus or behavior. Here we demonstrate our high-throughput approach using pharmacological, visual and noxious stimuli, as well as hunting and feeding. The resultant maps outline hundreds of areas associated with behaviors.