2453 Janelia Publications
Showing 101-110 of 2453 results

Color and motion are used by many species to identify salient objects. They are processed largely independently, but color contributes to motion processing in humans, for example, enabling moving colored objects to be detected when their luminance matches the background. Here, we demonstrate an unexpected, additional contribution of color to motion vision in Drosophila. We show that behavioral ON-motion responses are more sensitive to UV than OFF-motion responses are, and we identify cellular pathways connecting UV-sensitive R7 photoreceptors to ON- and OFF-motion-sensitive T4 and T5 cells using neurogenetics and calcium imaging. Remarkably, this contribution of color circuitry to motion vision enhances the detection of approaching UV discs, but not of green discs with the same chromatic contrast, and we show how this could generalize to systems with ON- and OFF-motion pathways. Our results provide a computational and circuit basis for how color enhances motion vision to favor the detection of saliently colored objects.
Medial frontal cortical areas are thought to play a critical role in the brain's ability to flexibly deploy strategies that are effective in complex settings. Still, the specific circuit computations that underpin this foundational aspect of intelligence remain unclear. Here, by examining neural ensemble activity in rats that sample different strategies in a self-guided search for latent task structure, we demonstrate robust tracking of individual strategy prevalence in the anterior cingulate cortex (ACC), especially in an area homologous to primate area 32D. Prevalence encoding in the ACC is wide-scale, independent of reward delivery, and persists through a substantial ensemble reorganization that tags ACC representations with contextual content. Our findings argue that ACC ensemble dynamics are structured by a summary statistic of recent behavioral choices, raising the possibility that the ACC plays a role in estimating, through statistical learning, which actions promote the occurrence of events in the environment.
Survival behaviors are orchestrated by hardwired circuits located in deep subcortical brain regions, most prominently the hypothalamus. Artificial activation of spatially localized, genetically defined hypothalamic cell populations is known to trigger distinct behaviors, suggesting a nucleus-centered organization of behavioral control. However, no study has investigated the hypothalamic representation of innate behaviors using unbiased, large-scale single neuron recordings. Here, using custom silicon probes, we performed recordings across the rostro-caudal extent of the medial hypothalamus in freely moving animals engaged in a diverse array of social and predator defense (“fear”) behaviors. Nucleus-averaged activity revealed spatially distributed generic “ignition signals” that occurred at the onset of each behavior, and did not identify sparse, nucleus-specific behavioral representations. Single-unit analysis revealed that social and fear behavior classes are encoded by activity in distinct sets of spatially distributed neuronal ensembles spanning the entire hypothalamic rostro-caudal axis. Individual ensemble membership, however, was drawn from neurons in 3-4 adjacent nuclei. Mixed selectivity was identified as the most prevalent mode of behavior representation by individual hypothalamic neurons. Encoding models indicated that a significant fraction of the variance in single neuron activity is explained by behavior. This work reveals that innate behaviors are encoded in the hypothalamus by activity in spatially distributed neural ensembles that each span multiple neighboring nuclei, complementing the prevailing view of hypothalamic behavioral control by single nucleus-restricted cell types derived from perturbational studies.
Ionic driving forces provide the net electromotive force for ion movement across receptors, channels, and transporters, and are a fundamental property of all cells. In the brain, for example, fast synaptic inhibition is mediated by chloride-permeable GABA-A receptors, and single-cell intracellular recordings have been the only method for estimating driving forces across these receptors (DF-GABA-A). Here we present a new tool for quantifying inhibitory receptor driving force named ORCHID: all-Optical Reporting of CHloride Ion Driving force. We demonstrate ORCHID's ability to provide accurate, high-throughput measurements of resting and dynamic DF-GABA-A from genetically targeted cell types over multiple timescales. ORCHID confirms theoretical predictions about the biophysical mechanisms that establish DF-GABA-A, reveals novel differences in DF-GABA-A between neurons and astrocytes, and affords the first in vivo measurements of intact DF-GABA-A. This work extends our understanding of inhibitory synaptic transmission and establishes a precedent for all-optical methods to assess ionic driving forces.
Recent studies in mice have shown that orofacial behaviors drive a large fraction of neural activity across the brain. To understand the nature and function of these signals, we need better computational models to characterize the behaviors and relate them to neural activity. Here we developed Facemap, a framework consisting of a keypoint tracking algorithm and a deep neural network encoder for predicting neural activity. Using the Facemap keypoints as input to the deep neural network, we predicted the activity of ∼50,000 simultaneously recorded neurons and, in visual cortex, doubled the explained variance compared with previous methods. Our keypoint tracking algorithm was more accurate than existing pose estimation tools, while its inference speed was several times faster, making it a powerful tool for closed-loop behavioral experiments. The Facemap tracker was easy to adapt to data from new labs, requiring as few as 10 annotated frames for near-optimal performance. Using Facemap, we found that the neuronal activity clusters most strongly driven by behaviors were more spatially spread out across cortex. We also found that the deep keypoint features inferred by the model had time-asymmetrical state dynamics that were not apparent in the raw keypoint data. In summary, Facemap provides a stepping stone toward understanding the function of brain-wide neural signals and their relation to behavior.
In the nucleus, biological processes are driven by proteins that diffuse through and bind to a meshwork of nucleic acid polymers. To better understand this interplay, we developed an imaging platform to simultaneously visualize single-protein dynamics together with the local chromatin environment in live cells. Together with super-resolution imaging, new fluorescent probes, and biophysical modeling, we demonstrated that nucleosomes display differential diffusion and packing arrangements as chromatin density increases, whereas the viscoelastic properties and accessibility of the interchromatin space remain constant. Perturbing nuclear functions impacted nucleosome diffusive properties in a manner that was dependent on local chromatin density and supportive of a model wherein transcription locally stabilizes nucleosomes while simultaneously allowing for the free exchange of nuclear proteins. Our results reveal that nuclear heterogeneity arises from both active and passive processes and highlight the need to account for different organizational principles when modeling different chromatin environments.
Live-cell super-resolution microscopy enables the imaging of biological structure dynamics below the diffraction limit. Here we present enhanced super-resolution radial fluctuations (eSRRF), substantially improving image fidelity and resolution compared to the original SRRF method. eSRRF incorporates automated parameter optimization based on the data itself, giving insight into the trade-off between resolution and fidelity. We demonstrate eSRRF across a range of imaging modalities and biological systems. Notably, we extend eSRRF to three dimensions by combining it with multifocus microscopy. This realizes live-cell volumetric super-resolution imaging with an acquisition speed of ~1 volume per second. eSRRF provides an accessible super-resolution approach, maximizing information extraction across varied experimental conditions while minimizing artifacts. Its optimal parameter prediction strategy is generalizable, moving toward unbiased and optimized analyses in super-resolution microscopy.
Animals of the same species exhibit similar behaviours that are advantageously adapted to their body and environment. These behaviours are shaped at the species level by selection pressures over evolutionary timescales. Yet, it remains unclear how these common behavioural adaptations emerge from the idiosyncratic neural circuitry of each individual. The overall organization of neural circuits is preserved across individuals because of their common evolutionarily specified developmental programme. Such organization at the circuit level may constrain neural activity, leading to low-dimensional latent dynamics across the neural population. Accordingly, here we suggested that the shared circuit-level constraints within a species would lead to suitably preserved latent dynamics across individuals. We analysed recordings of neural populations from monkey and mouse motor cortex to demonstrate that neural dynamics in individuals from the same species are surprisingly preserved when they perform similar behaviour. Neural population dynamics were also preserved when animals consciously planned future movements without overt behaviour and enabled the decoding of planned and ongoing movement across different individuals. Furthermore, we found that preserved neural dynamics extend beyond cortical regions to the dorsal striatum, an evolutionarily older structure. Finally, we used neural network models to demonstrate that behavioural similarity is necessary but not sufficient for this preservation. We posit that these emergent dynamics result from evolutionary constraints on brain development and thus reflect fundamental properties of the neural basis of behaviour.
For most model organisms in neuroscience, research into visual processing in the brain is difficult because of a lack of high-resolution maps that capture complex neuronal circuitry. The microinsect Megaphragma viggianii, because of its small size and non-trivial behavior, provides a unique opportunity for tractable whole-organism connectomics. We image its whole head using serial electron microscopy. We reconstruct its compound eye and analyze the optical properties of the ommatidia as well as the connectome of the first visual neuropil, the lamina. Compared with the fruit fly and the honeybee, the Megaphragma visual system is highly simplified: it has 29 ommatidia per eye and 6 lamina neuron types. We report features that are both stereotypical among most ommatidia and specialized to some. By identifying the "barebones" circuits critical for flying insects, our results will facilitate constructing computational models of visual processing in insects.