61 Publications
Showing 41-50 of 61 results

Escape behaviors deliver organisms away from imminent catastrophe. Here, we characterize behavioral responses of freely swimming larval zebrafish to looming visual stimuli simulating predators. We report that the visual system alone can recruit lateralized, rapid escape motor programs, similar to those elicited by mechanosensory modalities. Two-photon calcium imaging of retino-recipient midbrain regions isolated the optic tectum as an important center processing looming stimuli, with ensemble activity encoding the critical image size determining escape latency. Furthermore, we describe activity in retinal ganglion cell terminals and superficial inhibitory interneurons in the tectum during looming and propose a model for how temporal dynamics in tectal periventricular neurons might arise from computations between these two fundamental constituents. Finally, laser ablations of hindbrain circuitry confirmed that visual and mechanosensory modalities share the same premotor output network. We establish a circuit for the processing of aversive stimuli in the context of an innate visual behavior.
We present a modular approach for analyzing calcium imaging recordings of large neuronal ensembles. Our goal is to simultaneously identify the locations of the neurons, demix spatially overlapping components, and denoise and deconvolve the spiking activity from the slow dynamics of the calcium indicator. Our approach relies on a constrained nonnegative matrix factorization that expresses the spatiotemporal fluorescence activity as the product of a spatial matrix that encodes the spatial footprint of each neuron in the optical field and a temporal matrix that characterizes the calcium concentration of each neuron over time. This framework is combined with a novel constrained deconvolution approach that extracts estimates of neural activity from fluorescence traces, to create a spatiotemporal processing algorithm that requires minimal parameter tuning. We demonstrate the general applicability of our method by applying it to in vitro and in vivo multi-neuronal imaging data, whole-brain light-sheet imaging data, and dendritic imaging data.
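The core factorization idea above — expressing a (pixels × time) fluorescence movie as the product of a spatial-footprint matrix and a temporal-trace matrix — can be illustrated with plain nonnegative matrix factorization. This is only a minimal sketch on synthetic toy data using standard Lee–Seung multiplicative updates, not the paper's constrained NMF or its deconvolution step; all variable names and dimensions here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: a (pixels x time) movie Y built from two
# synthetic "neurons", each with a nonnegative spatial footprint
# (column of A_true) and a nonnegative calcium trace (row of C_true).
n_pixels, n_time, k = 100, 200, 2
A_true = np.abs(rng.normal(size=(n_pixels, k)))
C_true = np.abs(rng.normal(size=(k, n_time)))
Y = A_true @ C_true

# Plain NMF via Lee-Seung multiplicative updates: Y ~ A @ C, where A
# encodes each component's spatial footprint in the optical field and
# C its fluorescence time course. Nonnegativity is the key constraint.
A = np.abs(rng.normal(size=(n_pixels, k))) + 1e-3
C = np.abs(rng.normal(size=(k, n_time))) + 1e-3
for _ in range(300):
    C *= (A.T @ Y) / (A.T @ A @ C + 1e-12)
    A *= (Y @ C.T) / (A @ C @ C.T + 1e-12)

err = np.linalg.norm(Y - A @ C) / np.linalg.norm(Y)
print(f"relative reconstruction error: {err:.4f}")
```

The factorization is recovered only up to permutation and scaling of the components; the paper's method additionally constrains the temporal matrix with a calcium-dynamics model to demix overlapping cells and deconvolve spiking from the slow indicator.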
The dense connectivity in the brain means that one neuron's activity can influence many others. To observe this interconnected system comprehensively, an aspiration within neuroscience is to record from as many neurons as possible at the same time. There are two useful routes toward this goal: one is to expand the spatial extent of functional imaging techniques, and the second is to use animals with small brains. Here we review recent progress toward imaging many neurons and complete populations of identified neurons in small vertebrates and invertebrates.
The identification of active neurons and circuits in vivo is a fundamental challenge in understanding the neural basis of behavior. Genetically encoded calcium (Ca²⁺) indicators (GECIs) enable quantitative monitoring of cellular-resolution activity during behavior. However, such indicators require online monitoring within a limited field of view. Alternatively, post hoc staining of immediate early genes (IEGs) indicates highly active cells within the entire brain, albeit with poor temporal resolution. We designed a fluorescent sensor, CaMPARI, that combines the genetic targetability and quantitative link to neural activity of GECIs with the permanent, large-scale labeling of IEGs, allowing a temporally precise "activity snapshot" of a large tissue volume. CaMPARI undergoes efficient and irreversible green-to-red conversion only when elevated intracellular Ca²⁺ and experimenter-controlled illumination coincide. We demonstrate the utility of CaMPARI in freely moving larvae of zebrafish and flies, and in head-fixed mice and adult flies.
The nature of nervous system function and development is inherently global, since all components eventually influence one another. Networks communicate through dense synaptic, electric, and modulatory connections and develop through concurrent growth and interlinking of their neurons, processes, glia, and blood vessels. These factors drive the development of techniques capable of imaging neural signaling, anatomy, and developmental processes at ever-larger scales. Here, we discuss the nature of questions benefitting from large-scale imaging techniques and introduce recent applications. We focus on emerging light-sheet microscopy approaches, which are well suited for live imaging of large systems with high spatiotemporal resolution and over long periods of time. We also discuss computational methods suitable for extracting biological information from the resulting system-level image data sets. Together with new tools for reporting and manipulating neuronal activity and gene expression, these techniques promise new insights into the large-scale function and development of neural systems.
Developments in electrical and optical recording technology are scaling up the size of neuronal populations that can be monitored simultaneously. Light-sheet imaging is rapidly gaining traction as a method for optically interrogating activity in large networks and presents both opportunities and challenges for understanding circuit function.
The processing of sensory input and the generation of behavior involves large networks of neurons, which necessitates new technology for recording from many neurons in behaving animals. In the larval zebrafish, light-sheet microscopy can be used to record the activity of almost all neurons in the brain simultaneously at single-cell resolution. Existing implementations, however, cannot be combined with visually driven behavior because the light sheet scans over the eye, interfering with presentation of controlled visual stimuli. Here we describe a system that overcomes the confounding eye stimulation through the use of two light sheets and combines whole-brain light-sheet imaging with virtual reality for fictively behaving larval zebrafish.
Understanding brain function requires monitoring and interpreting the activity of large networks of neurons during behavior. Advances in recording technology are greatly increasing the size and complexity of neural data. Analyzing such data will pose a fundamental bottleneck for neuroscience. We present a library of analytical tools called Thunder built on the open-source Apache Spark platform for large-scale distributed computing. The library implements a variety of univariate and multivariate analyses with a modular, extendable structure well-suited to interactive exploration and analysis development. We demonstrate how these analyses find structure in large-scale neural data, including whole-brain light-sheet imaging data from fictively behaving larval zebrafish, and two-photon imaging data from behaving mouse. The analyses relate neuronal responses to sensory input and behavior, run in minutes or less and can be used on a private cluster or in the cloud. Our open-source framework thus holds promise for turning brain activity mapping efforts into biological insights.
Discrete populations of brainstem spinal projection neurons (SPNs) have been shown to exhibit behavior-specific responses during locomotion [1-9], suggesting that separate descending pathways, each dedicated to a specific behavior, control locomotion. In an alternative model, a large variety of motor outputs could be generated from different combinations of a small number of basic motor pathways. We examined this possibility by studying the precise role of ventromedially located hindbrain SPNs (vSPNs) in generating turning behaviors. We found that unilateral laser ablation of vSPNs reduces the tail deflection and cycle period specifically during the first undulation cycle of a swim bout, whereas later tail movements are unaffected. This holds true during phototaxic [10], optomotor [11], dark-flash-induced [12], and spontaneous turns [13], suggesting a universal role of these neurons in controlling turning behaviors. Importantly, we found that the ablation not only abolishes turns but also results in a dramatic increase in the number of forward swims, suggesting that these neurons transform forward swims into turns by introducing turning kinematics into a basic motor pattern of symmetric tail undulations. Finally, we show that vSPN activity is direction specific and graded by turning angle. Together, these results provide a clear example of how a specific motor pattern can be transformed into different behavioral events by the graded activation of a small set of SPNs.
A full understanding of nervous system function requires recording from large populations of neurons during naturalistic behaviors. Here we enable paralyzed larval zebrafish to fictively navigate two-dimensional virtual environments while we record optically from many neurons with two-photon imaging. Electrical recordings from motor nerves in the tail are decoded into intended forward swims and turns, which are used to update a virtual environment displayed underneath the fish. Several behavioral features-such as turning responses to whole-field motion and dark avoidance-are well-replicated in this virtual setting. We readily observed neuronal populations in the hindbrain with laterally selective responses that correlated with right or left optomotor behavior. We also observed neurons in the habenula, pallium, and midbrain with response properties specific to environmental features. Beyond single-cell correlations, the classification of network activity in such virtual settings promises to reveal principles of brainwide neural dynamics during behavior.