To successfully perform goal-directed navigation, animals must know where they are and what they are doing (e.g., looking for water, bringing food back to the nest, or escaping from a predator). Hippocampal neurons code for these critical variables conjunctively, but little is known about how this "where/what" code is formed or flexibly routed to other brain regions. To address these questions, we performed intracellular whole-cell recordings in mouse CA1 during a cued, two-choice virtual navigation task. We demonstrate that plateau potentials in CA1 pyramidal neurons rapidly strengthen synaptic inputs carrying conjunctive information about position and choice. Plasticity-induced response fields were modulated by cues only in animals previously trained to collect rewards based on available cues. Thus, we reveal that gradual learning is required for the formation of a conjunctive population code upstream of CA1, while plateau-potential-induced synaptic plasticity in CA1 enables flexible routing of the code to downstream brain regions.
A key challenge when imaging living cells is how to noninvasively extract the most spatiotemporal information possible. Unlike popular wide-field and confocal methods, plane-illumination microscopy limits excitation to the information-rich vicinity of the focal plane, providing effective optical sectioning and high speed while minimizing out-of-focus background and premature photobleaching. Here we used scanned Bessel beams in conjunction with structured illumination and/or two-photon excitation to create thinner light sheets (<0.5 μm) better suited to three-dimensional (3D) subcellular imaging. As demonstrated by imaging the dynamics of mitochondria, filopodia, membrane ruffles, intracellular vesicles and mitotic chromosomes in live cells, the microscope currently offers 3D isotropic resolution down to ~0.3 μm, speeds up to nearly 200 image planes per second and the ability to noninvasively acquire hundreds of 3D data volumes from single living cells encompassing tens of thousands of image frames.
Commentary: Plane illumination microscopy has proven to be a powerful tool for studying multicellular organisms and their development at single cell resolution. However, the light sheets employed are usually too thick to provide much benefit for imaging organelles within single cultured cells. Here we introduce the use of scanned Bessel beams to create much thinner light sheets better suited to long-term dynamic live cell imaging. Such light sheets not only minimize photobleaching and phototoxicity at the sub-cellular level, but also provide axial resolution enhancement, yielding isotropic three dimensional spatial resolution. Numerous movies are provided to demonstrate the wealth of 4D information (x,y,z,t) that can be obtained from single living cells by the method. Besides providing an attractive alternative to spinning disk, AOD-driven, or line scan confocal microscopes for high speed live cell imaging, the Bessel microscope might serve as a valuable platform for superresolution microscopy (PALM, structured illumination, or RESOLFT), since confinement of the excitation to the focal plane makes far better use of the limited fluorescence photon budget than does the traditional epi-illumination configuration.
Whole-organ imaging and characterization at the submicron level provide abundant information on development and disease but remain a major challenge, especially in terms of time cost. Herein, a quantitative light-sheet microscopy platform is developed that enables highly time-efficient assessment of fibrous structures within intact cleared tissue. Dual-view inverted selective plane illumination microscopy (diSPIM), followed by improved registration and deconvolution, yields submicron isotropic imaging of the mouse upper genital tract at one hundred-fold the speed of previous efforts. Further, optical metrics are developed that quantify the 3D local density and structural complexity of targets using parallel, vectorized convolution in both the spatial and frequency domains. Collectively, a ≈400–2000-fold increase in time efficiency, accounting for imaging, postprocessing, and quantitative characterization, is gained over the traditional method. Using this platform, the medulla and cortex of the mouse ovary are automatically identified with over 90% overlap with manual selection by anatomy experts. Additionally, heterogeneous distributions of immune cells in the mouse ovary and fallopian tube are discovered, offering a unique perspective for understanding the immune microenvironment. This work paves the way for future whole-organ studies and shows promise for offering mechanistic insights into physiological and pathological alterations of biological tissues.
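The frequency-domain convolution metric mentioned in the abstract can be illustrated with a minimal sketch: a 3D local-density map computed by convolving a binary structure mask with a spherical averaging kernel via FFT. This is not the paper's implementation; the function name, kernel choice, and toy volume are illustrative assumptions.

```python
import numpy as np
from numpy.fft import fftn, ifftn

def local_density(mask, radius=4):
    """Estimate 3D local density of a binary structure mask by convolving
    it with a normalized spherical kernel in the frequency domain.
    Illustrative sketch only, not the published pipeline."""
    # spherical averaging kernel centered in a volume of the mask's shape
    grids = np.meshgrid(*[np.arange(s) - s // 2 for s in mask.shape],
                        indexing="ij")
    kernel = (sum(g**2 for g in grids) <= radius**2).astype(float)
    kernel /= kernel.sum()
    # circular convolution: multiply spectra, invert, take the real part
    return np.real(ifftn(fftn(mask) * fftn(np.fft.ifftshift(kernel))))

# toy volume: a dense 8x8x8 block inside a 32^3 volume
vol = np.zeros((32, 32, 32))
vol[12:20, 12:20, 12:20] = 1.0
d = local_density(vol, radius=4)
# density is near 1 inside the block and falls to 0 far away from it
```

The FFT route matters at whole-organ scale: a direct spatial-domain convolution over a gigavoxel volume is far slower than two forward transforms and one inverse.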
Inducible and reversible perturbation of the activity of selected neurons in vivo is critical to understanding the dynamics of brain circuits. Several genetically encoded systems for rapid inducible neuronal silencing have been developed in the past few years offering an arsenal of tools for in vivo experiments. Some systems are based on ion-channels or pumps, others on G protein coupled receptors, and yet others on modified presynaptic proteins. Inducers range from light to small molecules to peptides. This diversity results in differences in the various parameters that may determine the applicability of each tool to a particular biological question. Although further development would be beneficial, the current silencing tool kit already provides the ability to make specific perturbations of circuit function in behaving animals.
We have conducted a genetic screen for mutations that decrease the effectiveness of signaling by a protein tyrosine kinase, the product of the Drosophila melanogaster sevenless gene. These mutations define seven genes whose wild-type products may be required for signaling by sevenless. Four of the seven genes also appear to be essential for signaling by a second protein tyrosine kinase, the product of the Ellipse gene. The putative products of two of these seven genes have been identified. One encodes a ras protein. The other locus encodes a protein that is homologous to the S. cerevisiae CDC25 protein, an activator of guanine nucleotide exchange by ras proteins. These results suggest that the stimulation of ras protein activity is a key element in the signaling by sevenless and Ellipse and that this stimulation may be achieved by activating the exchange of GTP for bound GDP by the ras protein.
Neurophysiology has long progressed through exploratory experiments and chance discoveries. Anecdotes abound of researchers listening to spikes in real time and noticing patterns of activity related to ongoing stimuli or behaviors. With the advent of large-scale recordings, such close observation of data has become difficult. To find patterns in large-scale neural data, we developed 'Rastermap', a visualization method that displays neurons as a raster plot after sorting them along a one-dimensional axis based on their activity patterns. We benchmarked Rastermap on realistic simulations and then used it to explore recordings of tens of thousands of neurons from mouse cortex during spontaneous, stimulus-evoked and task-evoked epochs. We also applied Rastermap to whole-brain zebrafish recordings; to wide-field imaging data; to electrophysiological recordings in rat hippocampus, monkey frontal cortex and various cortical and subcortical regions in mice; and to artificial neural networks. Finally, we illustrate high-dimensional scenarios where Rastermap and similar algorithms cannot be used effectively.
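The core idea of sorting neurons along one axis so that similar activity patterns end up adjacent can be sketched with a toy stand-in: a greedy ordering by pairwise correlation. The real Rastermap algorithm is considerably more sophisticated; this sketch and its function name are illustrative assumptions, not the published method.

```python
import numpy as np

def sort_by_similarity(activity, seed=0):
    """Greedily order neurons so that adjacent rows of the raster are
    highly correlated -- a toy stand-in for a 1D activity embedding."""
    corr = np.corrcoef(activity)            # neuron-by-neuron correlation
    order = [seed]
    remaining = set(range(corr.shape[0])) - {seed}
    while remaining:
        # append the unplaced neuron most correlated with the last one
        nxt = max(remaining, key=lambda j: corr[order[-1], j])
        order.append(nxt)
        remaining.discard(nxt)
    return order

# two interleaved groups of neurons with distinct temporal patterns
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 200)
group_a = np.sin(t) + 0.1 * rng.standard_normal((10, 200))
group_b = np.cos(3 * t) + 0.1 * rng.standard_normal((10, 200))
raster = np.empty((20, 200))
raster[::2], raster[1::2] = group_a, group_b    # interleave the rows
order = sort_by_similarity(raster)
# after sorting, each group's neurons occupy one contiguous block
```

Plotting `raster[order]` instead of `raster` turns the interleaved rows into two visually coherent bands, which is the kind of structure such a visualization is meant to expose.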
We investigate a practical approach to solving one instantiation of a distributed hypothesis testing problem under severe rate constraints that shows up in a wide variety of applications such as camera calibration, biometric authentication and video hashing: given two distributed continuous-valued random sources, determine if they satisfy a certain Euclidean distance criterion. We show a way to convert the problem from continuous-valued to binary-valued using binarized random projections and obtain rate savings by applying a linear syndrome code. In finding visual correspondences, our approach uses just 49% of the rate of scalar quantization to achieve the same level of retrieval performance. To perform video hashing, our approach requires only a hash rate of 0.0142 bpp to identify corresponding groups of pictures correctly.
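The binarized-random-projection step described above can be sketched as follows: each descriptor is reduced to the sign pattern of its projections onto shared random directions, and Hamming distance between the resulting bit strings tracks the geometric closeness of the originals. The syndrome-coding stage is omitted; the function name and dimensions are illustrative assumptions.

```python
import numpy as np

def binary_hash(x, projections):
    """Binarize a descriptor as the signs of its random projections."""
    return (projections @ x > 0).astype(np.uint8)

rng = np.random.default_rng(0)
d, k = 64, 256                       # descriptor dim, number of projections
P = rng.standard_normal((k, d))      # projection matrix shared by both sides

a = rng.standard_normal(d)
near = a + 0.05 * rng.standard_normal(d)    # small Euclidean perturbation
far = rng.standard_normal(d)                # unrelated descriptor

ha, hn, hf = (binary_hash(v, P) for v in (a, near, far))
# Hamming distance grows with the angle between descriptors, so the
# near pair disagrees on far fewer bits than the unrelated pair
print(int(np.sum(ha != hn)), int(np.sum(ha != hf)))
```

Because both parties draw `P` from a shared seed, only the k-bit hashes need to be communicated, which is where the rate savings over quantizing the continuous descriptors come from.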
We consider the problem of establishing visual correspondences in a distributed and rate-efficient fashion by broadcasting compact descriptors. Establishing visual correspondences is a critical task before other vision tasks can be performed in a camera network. We use coarsely quantized random projections of descriptors to build binary hashes, and use the Hamming distance between binary hashes as a matching criterion. In this work, we show that the Hamming distance between the binary hashes has a binomial distribution, with parameters that are a function of the number of random projections and the Euclidean distance between the original descriptors. We present experimental results that verify our result, and show that for the task of finding visual correspondences, sending binary hashes is more rate-efficient than prior approaches.
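The binomial claim above can be checked empirically. For sign quantization with Gaussian random projections, each of the k bits disagrees independently with probability θ/π, where θ is the angle between the two descriptors (for unit-norm descriptors, a function of their Euclidean distance), so the Hamming distance should follow Binomial(k, θ/π). The simulation below is a sketch under those standard assumptions, not the paper's experiment.

```python
import numpy as np

rng = np.random.default_rng(2)
d, k, trials = 32, 64, 2000

# a fixed pair of unit-norm descriptors at a known angle theta
theta = np.pi / 4
u = np.zeros(d); u[0] = 1.0
v = np.zeros(d); v[0], v[1] = np.cos(theta), np.sin(theta)
p = theta / np.pi                    # per-bit disagreement probability

# Hamming distance under many independent k-row projection matrices
dists = []
for _ in range(trials):
    P = rng.standard_normal((k, d))
    dists.append(np.sum((P @ u > 0) != (P @ v > 0)))
dists = np.asarray(dists)

# sample mean and variance should match Binomial(k, p)
print(dists.mean(), k * p)           # both near 16
print(dists.var(), k * p * (1 - p))  # both near 12
```

Knowing the distribution in closed form is what makes the matching criterion principled: a Hamming-distance threshold can be set to a target false-match probability rather than tuned empirically.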