To study the neural basis of behavior, we require methods to sensitively and accurately measure neural activity at single-neuron and single-spike resolution. Extracellular electrophysiology is a principal method for achieving this, but it has biases in the neurons it detects and it imperfectly resolves their action potentials. To overcome these limitations, we developed a silicon probe with significantly smaller and denser recording sites than previous designs, called Neuropixels Ultra (NP Ultra). This device measures neuronal activity at ultra-high densities (>1300 sites per mm, 10 times higher than previous probes), with 6 µm center-to-center spacing and low noise. This device effectively comprises an implantable voltage-sensing camera that captures a planar image of a neuron's electrical field. We introduce a new spike sorting algorithm optimized for these probes and use it to find that the yield of visually responsive neurons in recordings from mouse visual cortex improves ∼3-fold. Recordings across multiple brain regions and four species revealed a subset of unexpectedly small extracellular action potentials not previously reported. Further experiments determined that, in visual cortex, these do not correspond to major subclasses of interneurons and instead likely reflect recordings from axons. Finally, using ground-truth identification of cortical inhibitory cell types with optotagging, we found that cell type was discriminable with approximately 75% success among three types, a significant improvement over lower-resolution recordings. NP Ultra improves spike sorting performance, sampling bias, and cell type classification.
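As a rough illustration of the geometry stated above (6 µm center-to-center spacing, >1300 sites per mm), the short Python sketch below computes linear site density for an assumed 48 × 8 grid of recording sites and shows how per-channel spike amplitudes can be reshaped into the planar "image" the abstract describes. The grid dimensions and channel count are assumptions for illustration, not specifications taken from the text.

```python
# Sketch: treating an NP Ultra-style site layout as a 2D "voltage-sensing camera".
# Assumptions (not from the abstract): 48 rows x 8 columns of sites on a 6 um grid;
# only the 6 um center-to-center pitch and the ">1300 sites per mm" figure are stated above.
import numpy as np

PITCH_UM = 6.0           # center-to-center site spacing (stated in the abstract)
N_ROWS, N_COLS = 48, 8   # assumed grid; 48 * 8 = 384 simultaneously recorded sites

# Linear site density along the shank for the assumed grid.
span_mm = (N_ROWS * PITCH_UM) / 1000.0       # shank length covered, in mm
sites_per_mm = (N_ROWS * N_COLS) / span_mm   # ~1333 sites/mm, consistent with ">1300"
print(f"{sites_per_mm:.0f} sites per mm of shank")

# A single spike's extracellular footprint can then be viewed as a small image:
# reshape per-channel peak amplitudes (channels ordered row-major) into rows x cols.
peak_amplitude_uv = np.random.randn(N_ROWS * N_COLS)  # placeholder for real spike amplitudes
footprint_image = peak_amplitude_uv.reshape(N_ROWS, N_COLS)
print(footprint_image.shape)  # (48, 8): a planar "image" of the neuron's electric field
```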
The simultaneous visualization, identification and targeting of neurons during patch clamp-mediated electrophysiological recordings is a basic technique in neuroscience, yet it is often complicated by the inability to visualize the pipette tip, particularly in deep brain tissue. Here we demonstrate a novel approach in which fluorescent quantum dot probes are used to coat pipettes prior to their use. The strong two-photon absorption cross sections of the quantum dots afford robust contrast at significantly deeper penetration depths than current methods allow. We demonstrate the utility of this technique in multiple recording formats both in vitro and in vivo where imaging of the pipettes is achieved at remarkable depths (up to 800 microns). Notably, minimal perturbation of cellular physiology is observed over the hours-long time course of neuronal recordings. We discuss our results within the context of the role that quantum dot nanoprobes may play in understanding neuronal cell physiology.
The hippocampus is critical for recollecting and imagining experiences. This is believed to involve voluntarily drawing from hippocampal memory representations of people, events, and places, including maplike representations of familiar environments. However, whether representations in such "cognitive maps" can be volitionally accessed is unknown. We developed a brain-machine interface to test whether rats can do so by controlling their hippocampal activity in a flexible, goal-directed, and model-based manner. We found that rats can efficiently navigate or direct objects to arbitrary goal locations within a virtual reality arena solely by activating and sustaining appropriate hippocampal representations of remote places. This provides insight into the mechanisms underlying episodic memory recall, mental simulation and planning, and imagination and opens up possibilities for high-level neural prosthetics that use hippocampal representations.
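The closed-loop idea described above, decoding which remote place the hippocampal population is currently representing and steering a virtual object toward it, can be sketched with a toy place-cell model and a Poisson maximum-likelihood decoder. The tuning curves, decoder, cell count, and control gain below are illustrative assumptions, not the method used in the study.

```python
# Minimal sketch of a hippocampal BMI loop: decode the represented place from
# population activity, then move a controlled object toward the decoded location.
# All specifics (cell count, tuning model, decoder, gain) are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_cells, arena = 50, 100.0                     # hypothetical place cells tiling a 100 cm track
place_fields = np.linspace(0, arena, n_cells)  # assumed preferred location of each cell
sigma = 8.0                                    # assumed tuning width (cm)

def expected_rates(x):
    """Gaussian place-field tuning: expected firing rate (Hz) of each cell at position x."""
    return 10.0 * np.exp(-0.5 * ((x - place_fields) / sigma) ** 2)

def decode_position(spike_counts, dt=0.1):
    """Poisson maximum-likelihood decode of the represented position from spike counts."""
    grid = np.linspace(0, arena, 201)
    rates = np.stack([expected_rates(x) * dt for x in grid])        # (positions, cells)
    log_like = (spike_counts * np.log(rates + 1e-9) - rates).sum(axis=1)
    return grid[np.argmax(log_like)]

# Closed loop: activity representing a remote goal is decoded each time step and the
# BMI nudges the controlled object toward the decoded place.
object_pos, goal = 10.0, 80.0
for _ in range(20):
    counts = rng.poisson(expected_rates(goal) * 0.1)  # simulated activity for the remote goal
    decoded = decode_position(counts)
    object_pos += 0.2 * (decoded - object_pos)        # step the object toward the decoded place
print(f"object ends near {object_pos:.1f} cm (goal {goal} cm)")
```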