How do we observe the environment, decide which behavior to perform, and turn our experience into memories that determine our future behavior? These processes engage many brain areas, including sensory and motor areas, decision-related areas and neuromodulatory centers. We use modern microscopy methods to study activity across the entire brain of behaving zebrafish to map how large populations of neurons collaborate to implement the computations underlying behavior. We are particularly interested in motor learning, short-term memory and the generation of spontaneous behavior.
To study these processes, we develop the experimental techniques required to answer our questions and design algorithms for analyzing the large datasets our experiments produce. We collaborate extensively with other groups. Below are some examples of past and ongoing projects.
Motor learning through neuromodulation
The serotonergic system mediates short-term motor learning by detecting the outcomes of swim actions and using them to modulate future behavior. Kawashima et al., Cell, 2016
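The outcome-dependent modulation described above can be caricatured as a simple gain-adaptation loop. This is a toy sketch, not the model from Kawashima et al.: the vigor variable, learning rate, and target displacement are all hypothetical.

```python
def adapt_vigor(feedback_gains, lr=0.2, target=1.0, v0=1.0):
    """Toy motor-learning loop (hypothetical parameters, not the paper's model).

    On each trial the animal swims with some vigor, observes the resulting
    displacement (vigor times the environment's feedback gain), and adjusts
    vigor so that the observed outcome tracks a target displacement.
    """
    vigor = v0
    history = []
    for gain in feedback_gains:
        outcome = vigor * gain              # displacement the fish observes
        vigor += lr * (target - outcome)    # raise vigor when outcome falls short
        vigor = max(vigor, 0.0)             # vigor cannot be negative
        history.append(vigor)
    return history
```

When the environment's gain is halved, the fixed point of this update doubles the vigor; when the gain is doubled, vigor settles at half its original value.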
Mapping the brain for neural activity controlling exploratory locomotion
How does the brain control behavior when there is no guidance from the environment? Dunn et al., eLife, 2016
Left: path of a larval zebrafish spontaneously swimming in virtual reality. Middle: functionally identified neurons tuned to turning, overlaid with a map of spinal projection neurons. Right: activity of turning-tuned neurons (green and magenta) and turning behavior (black).
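One common way to functionally identify tuned neurons like those in the middle panel is to correlate each neuron's activity trace with a behavioral regressor. A minimal sketch, assuming trial-long fluorescence traces and a behavioral time series; the correlation cutoff and array shapes are illustrative, not taken from the paper.

```python
import numpy as np

def turning_tuned(activity, turning, threshold=0.5):
    """Find neurons whose activity correlates with a turning regressor.

    activity:  (n_neurons, n_timepoints) array of fluorescence traces
    turning:   (n_timepoints,) behavioral regressor, e.g. signed turn rate
    threshold: hypothetical absolute-correlation cutoff

    Returns (indices of tuned neurons, per-neuron Pearson correlations).
    """
    # z-score along time so a dot product yields the Pearson correlation
    a = activity - activity.mean(axis=1, keepdims=True)
    a /= a.std(axis=1, keepdims=True)
    t = (turning - turning.mean()) / turning.std()
    r = a @ t / turning.size
    return np.flatnonzero(np.abs(r) > threshold), r
```

Neurons with strongly negative correlations (turns in the opposite direction) are also returned, since the threshold is applied to the absolute value.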
Extended depth of field light-sheet microscopy
In collaboration with the Yuste lab we developed a system for extended depth of field light-sheet imaging: Quirin et al., Optics Letters, 2016.
Whole-brain imaging and virtual reality
We developed a system for performing whole-brain, neuron-level recordings of the brains of larval zebrafish behaving in virtual reality: Vladimirov et al., Nature Methods, 2014.
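At its core, such a system closes the loop between the animal's motor output and the visual scene it sees. The following is a deliberately simplified caricature of that loop; the drift, gain, and timestep values are made up, and the real system operates on fictive swim signals recorded electrophysiologically.

```python
def simulate_vr(swim_signals, gain=1.0, drift=-0.05, dt=0.01):
    """Toy closed-loop virtual-reality update (illustrative parameters).

    swim_signals: per-timestep motor output; gain couples motor output to
    visual feedback, while drift moves the scene backward when the fish
    is at rest. Returns the position trace of the virtual environment.
    """
    pos = 0.0
    trace = []
    for s in swim_signals:
        pos += (drift + gain * s) * dt  # swimming counteracts the drift
        trace.append(pos)
    return trace
```

Manipulating the gain parameter in a loop like this is what makes closed-loop experiments on motor adaptation possible: the same motor output can be made to produce stronger or weaker visual consequences.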
Whole-brain activity without (left) and with (middle) sensory stimulation.
Large-scale data analysis
The Freeman lab collaborated with us to create a library for large-scale neural data analysis and apply it to whole-brain data: Freeman et al., Nature Methods, 2014. Resources, including analysis examples, links to code, and example data, are available online.
Left: orientation tuning throughout the brain. Right: low-dimensional representation of whole-brain responses to directional visual stimuli.
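A low-dimensional representation like the one in the right panel can be obtained by principal component analysis of a neuron-by-stimulus response matrix. This is a small single-machine sketch, not the distributed library from Freeman et al.; the matrix shapes and component count are illustrative.

```python
import numpy as np

def low_dim_embedding(responses, n_components=2):
    """PCA via SVD: embed each stimulus as a point in component space.

    responses: (n_neurons, n_stimuli) trial-averaged response matrix.
    Returns an (n_stimuli, n_components) array of stimulus coordinates.
    """
    # center each neuron's responses across stimuli
    centered = responses - responses.mean(axis=1, keepdims=True)
    # full_matrices=False keeps only min(n_neurons, n_stimuli) components
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    # scale the right singular vectors by the singular values
    return Vt[:n_components].T * s[:n_components]
```

Because the left singular vectors are orthonormal, the full embedding preserves pairwise distances between the centered response patterns, so nearby points in the plot correspond to stimuli that evoke similar whole-brain responses.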
Interested in joining the lab?
We have opportunities for postdoctoral researchers, PhD students (through the Johns Hopkins / Janelia graduate program or the graduate research fellowship program), and undergraduate students (through the Janelia summer undergraduate program).
For inquiries email firstname.lastname@example.org.