194 Publications
Showing 131-140 of 194 results

Electron crystallography is arguably the only electron cryomicroscopy (cryo-EM) technique able to deliver atomic-resolution data (better than 3 Å) for membrane proteins embedded in a membrane. Progress in hardware and in sample preparation for diffraction analysis has led to a number of recent examples in which increasingly high resolutions were achieved. Other chapters in this book detail the improvements in hardware and delve into the intricate art of sample preparation for microscopy and of electron diffraction data collection and processing. In this chapter, we describe in detail the protocols for molecular replacement in electron diffraction studies. Using a search model to phase electron diffraction data essentially eliminates the need to acquire image data, rendering the method immune to the aberrations from drift and charging that effectively lower the attainable resolution.
Optical approaches for tracking neural dynamics are of widespread interest, but a theoretical framework quantifying the physical limits of these techniques has been lacking. We formulate such a framework by using signal detection and estimation theory to obtain physical bounds on the detection of neural spikes and the estimation of their occurrence times as set by photon counting statistics (shot noise). These bounds are succinctly expressed via a discriminability index that depends on the kinetics of the optical indicator and the relative fluxes of signal and background photons. This approach facilitates quantitative evaluations of different indicators, detector technologies, and data analyses. Our treatment also provides optimal filtering techniques for optical detection of spikes. We compare various types of Ca(2+) indicators and show that background photons are a chief impediment to voltage sensing. Thus, voltage indicators that change color in response to membrane depolarization may offer a key advantage over those that change intensity. We also examine fluorescence resonance energy transfer indicators and identify the regimes in which the widely used ratiometric analysis of signals is substantially suboptimal. Overall, by showing how different optical factors interact to affect signal quality, our treatment offers a valuable guide to experimental design and provides measures of confidence to assess optically extracted traces of neural activity.
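As a rough illustration of the shot-noise bound described above, consider a simplified Poisson model (our own sketch, not the paper's exact formulation): a spike adds a transient of extra signal photons on top of a baseline of signal and background photons, and the discriminability index scales as the transient amplitude over the square root of the total photon count.

```python
import math

def discriminability(delta_signal, baseline_signal, background):
    """Shot-noise-limited discriminability index for detecting a transient
    that adds `delta_signal` photons on top of a Poisson baseline of
    `baseline_signal` signal photons plus `background` photons.
    Simplified sketch: d' = delta / sqrt(total photon count)."""
    return delta_signal / math.sqrt(baseline_signal + background)

# Background photons dominate the noise floor for dim voltage signals:
low_bg = discriminability(50, 1000, 100)     # modest background
high_bg = discriminability(50, 1000, 10000)  # background-dominated
assert low_bg > high_bg
```

This toy model makes the paper's point about voltage sensing concrete: raising the background flux alone degrades detectability, which is why indicators that shift color (and so can reject background spectrally) may beat those that only change intensity.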
Active dendritic synaptic integration enhances the computational power of neurons. Such nonlinear processing generates an object-localization signal in the apical dendritic tuft of layer 5B cortical pyramidal neurons during sensory-motor behavior. Here, we employ electrophysiological and optical approaches in brain slices and behaving animals to investigate how excitatory synaptic input to this distal dendritic compartment influences neuronal output. We find that active dendritic integration throughout the apical dendritic tuft is highly compartmentalized by voltage-gated potassium (KV) channels. A high density of both transient and sustained KV channels was observed in all apical dendritic compartments. These channels potently regulated the interaction between apical dendritic tuft, trunk, and axosomatic integration zones to control neuronal output in vitro as well as the engagement of dendritic nonlinear processing in vivo during sensory-motor behavior. Thus, KV channels dynamically tune the interaction between active dendritic integration compartments in layer 5B pyramidal neurons to shape behaviorally relevant neuronal computations.
The pea aphid, Acyrthosiphon pisum, exhibits several environmentally cued, discrete, alternate phenotypes (polyphenisms) during its life cycle. In the case of the reproductive polyphenism, differences in day length determine whether mothers will produce daughters that reproduce either sexually by laying fertilized eggs (oviparous sexual reproduction), or asexually by allowing oocytes to complete embryogenesis within the mother without fertilization (viviparous parthenogenesis). Oocytes and embryos that are produced asexually develop more rapidly, are yolk-free, and are much smaller than oocytes and embryos that are produced sexually. Perhaps most strikingly, the process of oocyte differentiation is truncated in the case of asexual/viviparous development, potentially precluding interactions between the oocyte and surrounding follicle cells that might take place during sexual/oviparous development. Given the important patterning roles that oocyte-follicle cell interactions play in Drosophila, these overt differences suggest that there may be underlying differences in the molecular mechanisms of pattern formation. We have found differences in the expression of homologs of torso-like and tailless, as well as activated MAP kinase, suggesting that there are important differences in the hemipteran version of the terminal patterning system between viviparous and oviparous development. Establishing such differences in the expression of patterning genes between these developmental modes is a first step toward understanding how a single genome manages to direct patterning events in such different embryological contexts.
The distinctive distributions of proteins within subcellular compartments both at steady state and during signaling events have essential roles in cell function. Here we describe a method for delineating the complex arrangement of proteins within subcellular structures visualized using point-localization superresolution (PL-SR) imaging. The approach, called pair correlation photoactivated localization microscopy (PC-PALM), uses a pair-correlation algorithm to precisely identify single molecules in PL-SR imaging data sets, and it is used to decipher quantitative features of protein organization within subcellular compartments, including the existence of protein clusters and the size, density and number of proteins in these clusters. We provide a step-by-step protocol for PC-PALM, illustrating its analysis capability for four plasma membrane proteins tagged with photoactivatable GFP (PAGFP). The experimental steps for PC-PALM can be carried out in 3 d and the analysis can be done in ∼6-8 h. Researchers need to have substantial experience in single-molecule imaging and statistical analysis to conduct the experiments and carry out this analysis.
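The core of the PC-PALM analysis is a radial pair-correlation function computed over the localized molecular positions. As a hedged illustration of the underlying statistic (a generic estimator, not the published PC-PALM implementation), the sketch below bins pairwise distances between 2-D localizations and normalizes each annulus by the count expected for a spatially random process; clustered proteins give g(r) well above 1 at short range.

```python
import math, random

def pair_correlation(points, r_max, n_bins, area):
    """Radial pair-correlation estimate g(r) for 2-D localizations.
    Counts pairwise distances into annular bins and normalizes by the
    expected pair count under complete spatial randomness. Edge effects
    are ignored -- a simplification for illustration only."""
    n = len(points)
    density = n / area
    dr = r_max / n_bins
    counts = [0] * n_bins
    for i in range(n):
        xi, yi = points[i]
        for j in range(i + 1, n):
            d = math.hypot(xi - points[j][0], yi - points[j][1])
            if d < r_max:
                counts[min(int(d / dr), n_bins - 1)] += 1
    g = []
    for k in range(n_bins):
        r_lo, r_hi = k * dr, (k + 1) * dr
        ring_area = math.pi * (r_hi ** 2 - r_lo ** 2)
        expected = 0.5 * n * density * ring_area  # pairs expected under CSR
        g.append(counts[k] / expected if expected > 0 else 0.0)
    return g

random.seed(0)
pts = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(500)]
g = pair_correlation(pts, r_max=1.0, n_bins=10, area=100.0)
# For uniformly random points, g(r) hovers near 1 across the bins.
```

In the actual protocol this statistic is further corrected for the over-counting that repeated photoactivation of single PAGFP molecules produces, which is what lets PC-PALM separate genuine protein clusters from localization artifacts.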
A new generation of direct electron detectors for transmission electron microscopy (TEM) promises significant improvement over previous detectors in terms of their modulation transfer function (MTF) and detective quantum efficiency (DQE). However, the performance of these new detectors needs to be carefully monitored in order to optimize imaging conditions and check for degradation over time. We have developed an easy-to-use software tool, FindDQE, to measure MTF and DQE of electron detectors using images of a microscope’s built-in beam stop. Using this software, we have determined the DQE curves of four direct electron detectors currently available: the Gatan K2 Summit, the FEI Falcon I and II, and the Direct Electron DE-12, under a variety of total dose and dose rate conditions. We have additionally measured the curves for the Gatan US4000 and TVIPS TemCam-F416 scintillator-based cameras. We compare the results from our new method with published curves.
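A commonly used relation in detector characterization (stated here as background, not as FindDQE's internal algorithm) ties the three quantities together: DQE(f) = DQE(0) · MTF(f)² / NNPS(f), where NNPS is the noise power spectrum normalized to its zero-frequency value. The hypothetical helper below just evaluates that relation over tabulated curves.

```python
def dqe_curve(mtf, nnps, dqe0):
    """Detective quantum efficiency versus spatial frequency via the
    common relation DQE(f) = DQE(0) * MTF(f)^2 / NNPS(f), with NNPS
    the noise power spectrum normalized to 1 at f = 0.
    Hypothetical helper for illustration, not part of FindDQE."""
    return [dqe0 * m * m / p for m, p in zip(mtf, nnps)]

# An idealized counting detector whose noise is shaped exactly like its
# signal (NNPS tracking MTF^2) keeps its DQE flat with frequency:
mtf = [1.0, 0.8, 0.5, 0.3]
nnps = [m * m for m in mtf]
flat = dqe_curve(mtf, nnps, 0.9)  # stays near DQE(0) at every frequency
```

The practical point is that MTF alone is a poor figure of merit: a detector can blur (low MTF) yet retain high DQE if its noise falls off just as fast, which is why direct detectors are compared on DQE curves rather than MTF.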
The glucose transporter, GLUT4, redistributes to the plasma membrane (PM) upon insulin stimulation, but also recycles through endosomal compartments. Different Rab proteins control these transport itineraries of GLUT4. However, the specific roles played by different Rab proteins in GLUT4 trafficking have been difficult to assess, primarily due to the complexity of endomembrane organization and trafficking. To address this problem, we recently performed advanced live cell imaging using total internal reflection fluorescence (TIRF) microscopy, which images objects ~150 nm from the PM, directly visualizing GLUT4 trafficking in response to insulin stimulation. Using IRAP-pHluorin to selectively label GLUT4 storage vesicles (GSVs) undergoing PM fusion in response to insulin, we identified Rab10 as the only Rab protein that binds this compartment. Rab14 was found to label transferrin-positive, endosomal compartments containing GLUT4. These also could fuse with the PM in response to insulin, albeit more slowly. Several other Rab proteins, including Rab4A, 4B and 8A, were found to mediate GLUT4 intra-endosomal recycling, serving to internalize surface-bound GLUT4 into endosomal compartments for ultimate delivery to GSVs. Thus, multiple Rab proteins regulate the circulation of GLUT4 molecules within the endomembrane system, maintaining optimal insulin responsiveness within cells.
Transcription is reported to be spatially compartmentalized in nuclear transcription factories with clusters of RNA polymerase II (Pol II). However, little is known about when these foci assemble or their relative stability. We developed a quantitative single-cell approach to characterize protein spatiotemporal organization, with single-molecule sensitivity in live eukaryotic cells. We observed that Pol II clusters form transiently, with an average lifetime of 5.1 (± 0.4) seconds, which refutes the notion that they are statically assembled substructures. Stimuli affecting transcription yielded orders-of-magnitude changes in the dynamics of Pol II clusters, which implies that clustering is regulated and plays a role in the cell's ability to mount a rapid response to external signals. Our results suggest that transient crowding of enzymes may aid in rate-limiting steps of gene regulation.
Population neural recordings with long-range temporal structure are often best understood in terms of a shared underlying low-dimensional dynamical process. Advances in recording technology provide access to an ever larger fraction of the population, but the standard computational approaches available to identify the collective dynamics scale poorly with the size of the dataset. Here we describe a new, scalable approach to discovering the low-dimensional dynamics that underlie simultaneously recorded spike trains from a neural population. Our method is based on recurrent linear models (RLMs), and relates closely to time-series models based on recurrent neural networks. We formulate RLMs for neural data by generalising the Kalman-filter-based likelihood calculation for latent linear dynamical system (LDS) models to incorporate a generalised-linear observation process. We show that RLMs describe motor-cortical population data better than either directly coupled generalised-linear models or latent linear dynamical system models with generalised-linear observations. We also introduce the cascaded linear model (CLM) to capture low-dimensional instantaneous correlations in neural populations. The CLM describes the cortical recordings better than either Ising or Gaussian models and, like the RLM, can be fit exactly and quickly. The CLM can also be seen as a generalisation of a low-rank Gaussian model, in this case factor analysis. The computational tractability of the RLM and CLM allows both to scale to very high-dimensional neural data.
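To make the RLM idea concrete, here is a deliberately tiny sketch (scalar state, our own parameter names, not the paper's model): the latent state follows a linear recurrence driven by the previous observation, in the spirit of a Kalman-filter recursion collapsed into a single recurrent update, and spike counts are Poisson through a log link. The log-likelihood is then exact and cheap, which is the tractability the abstract emphasizes.

```python
import math

def rlm_loglik(spikes, A, B, C, d):
    """Exact log-likelihood of a spike-count series under a toy 1-D
    recurrent linear model: the state is updated deterministically from
    the previous state and previous observation, and counts are Poisson
    with a log link. Scalars throughout for illustration; real models
    are multivariate."""
    x, ll = 0.0, 0.0
    for y in spikes:
        rate = math.exp(C * x + d)           # generalised-linear observation
        ll += y * math.log(rate) - rate - math.lgamma(y + 1)  # Poisson log-pmf
        x = A * x + B * y                    # recurrent linear state update
    return ll

ll = rlm_loglik([0, 2, 1, 3, 0, 1], A=0.9, B=0.1, C=1.0, d=0.0)
```

Because the recursion is deterministic given the data, no latent-state inference (E-step or sampling) is needed to evaluate the likelihood, and fitting reduces to direct gradient ascent on parameters like A, B, C, d.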
Neural language models (LMs) based on recurrent neural networks (RNNs) are some of the most successful word- and character-level LMs. Why do they work so well, in particular better than linear neural LMs? Possible explanations are that RNNs have an implicitly better regularization, or that RNNs have a higher capacity for storing patterns due to their nonlinearities, or both. Here we argue for the first explanation in the limit of little training data and for the second for large amounts of text data. We show state-of-the-art performance on the popular and small Penn dataset when RNN LMs are regularized with random dropout. Nonetheless, we show even better performance from a simplified, much less expressive linear RNN model without off-diagonal entries in the recurrent matrix. We call this model an impulse-response LM (IRLM). Using random dropout, column normalization and annealed learning rates, IRLMs develop neurons that keep a memory of up to 50 words in the past and achieve a perplexity of 102.5 on the Penn dataset. On two large datasets, however, the same regularization methods are unsuccessful for both models, and the RNN's expressivity allows it to overtake the IRLM by 10 and 20 percent perplexity, respectively. Despite the perplexity gap, IRLMs still outperform RNNs on the Microsoft Research Sentence Completion (MRSC) task. We develop a slightly modified IRLM that separates long-context units (LCUs) from short-context units, and show that the LCUs alone achieve state-of-the-art performance of 60.8% on the MRSC task. Our analysis indicates that a fruitful direction of research for neural LMs lies in developing more accessible internal representations, and suggests an optimization regime of very high momentum terms for effectively training such models.
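The "impulse-response" name follows directly from the diagonal recurrent matrix: with no off-diagonal entries, each hidden unit is an independent leaky accumulator, so the state is just a sum of exponentially decayed impulses from past word embeddings. A minimal sketch of that recurrence (names, shapes, and the toy embedding are our own, not the paper's):

```python
import random

def irlm_states(word_ids, decays, embed):
    """Hidden-state trajectory of a toy impulse-response LM. `decays`
    holds the diagonal entries of the recurrent matrix, so unit i
    evolves as h[i] <- decays[i] * h[i] + x[i]: an exponentially
    decaying memory of past inputs. Illustrative sketch only."""
    n = len(decays)
    h = [0.0] * n
    trajectory = []
    for w in word_ids:
        x = embed(w)                                   # input embedding
        h = [decays[i] * h[i] + x[i] for i in range(n)]  # diagonal recurrence
        trajectory.append(list(h))
    return trajectory

# A unit with decay near 1 retains context over tens of words (the
# paper's long-context units); a unit with decay 0.5 forgets quickly.
random.seed(1)
vocab = {w: [random.gauss(0, 1) for _ in range(2)] for w in range(10)}
traj = irlm_states([1, 2, 3, 1, 4], decays=[0.99, 0.5], embed=lambda w: vocab[w])
```

Because each unit's impulse response is a simple geometric decay, the memory span of a unit can be read off its diagonal entry, which is what makes the model's internal representation so much more accessible than a full nonlinear RNN's.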