2432 Janelia Publications
Showing 141-150 of 2432 results

Animal brains are complex organs composed of thousands of interconnected neurons. Characterizing the network properties of these brains is a requisite step towards understanding mechanisms of computation and information flow. With the completion of the Flywire project, we now have access to the connectome of a complete adult Drosophila brain, containing 130,000 neurons and millions of connections. Here, we present a statistical summary and data products of the Flywire connectome, delving into its network properties and topological features. To gain insights into local connectivity, we computed the prevalence of two- and three-node network motifs, examined their strengths and neurotransmitter compositions, and compared these topological metrics with wiring diagrams of other animals. We uncovered a population of highly connected neurons known as the "rich club" and identified subsets of neurons that may serve as integrators or broadcasters of signals. Finally, we examined subnetworks based on 78 anatomically defined brain regions. The freely available data and neuron populations presented here will serve as a foundation for models and experiments exploring the relationship between neural activity and anatomical structure.
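The two-node motif census mentioned in the abstract can be sketched in a few lines. This is an illustrative toy, not the Flywire pipeline: the edge list below is made up, and the counts distinguish only reciprocal pairs from one-way connections.

```python
# Toy sketch of a two-node motif census on a directed wiring diagram.
# The edge list is illustrative, not Flywire data.
edges = {("A", "B"), ("B", "A"), ("B", "C"), ("C", "D"), ("D", "C")}

reciprocal = 0       # pairs connected in both directions
unidirectional = 0   # pairs connected in one direction only
seen = set()
for (u, v) in edges:
    pair = frozenset((u, v))
    if pair in seen:      # count each neuron pair once
        continue
    seen.add(pair)
    if (v, u) in edges:
        reciprocal += 1
    else:
        unidirectional += 1

print(reciprocal, unidirectional)  # → 2 1
```

Three-node motif counts work the same way, enumerating the 13 possible connected triad patterns instead of the two pair patterns.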
Neurophysiology has long progressed through exploratory experiments and chance discoveries. Anecdotes abound of researchers setting up experiments while listening to spikes in real time and observing a pattern of consistent firing when certain stimuli or behaviors happened. With the advent of large-scale recordings, such close observation of data has become harder because high-dimensional spaces are impenetrable to our pattern-finding intuitions. To help ourselves find patterns in neural data, our lab has been openly developing a visualization framework known as “Rastermap” over the past five years. Rastermap takes advantage of a new global optimization algorithm for sorting neural responses along a one-dimensional manifold. Displayed as a raster plot, the sorted neurons show a variety of activity patterns, which can be more easily identified and interpreted. We first benchmark Rastermap on realistic simulations with multiplexed cognitive variables. Then we demonstrate it on recordings of tens of thousands of neurons from mouse visual and sensorimotor cortex during spontaneous, stimulus-evoked and task-evoked epochs, as well as on whole-brain zebrafish recordings, widefield calcium imaging data, population recordings from rat hippocampus and artificial neural networks. Finally, we illustrate high-dimensional scenarios where Rastermap and similar algorithms cannot be used effectively.
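The idea of making raster plots legible by reordering neurons can be illustrated with a deliberately crude stand-in: sorting each neuron by the time bin of its peak response. This is not the Rastermap algorithm, which solves a global optimization over the full similarity structure, but it shows why a good one-dimensional ordering makes sequential structure visible.

```python
# Crude stand-in for raster sorting (NOT the Rastermap algorithm):
# order neurons by the time bin of their peak response so that
# sequentially active neurons end up adjacent in the raster.
responses = [
    [0, 1, 5, 2, 0],   # neuron 0 peaks at bin 2
    [4, 2, 1, 0, 0],   # neuron 1 peaks at bin 0
    [0, 0, 1, 3, 6],   # neuron 2 peaks at bin 4
]

def peak_time(trace):
    """Index of the first maximum of a response trace."""
    return max(range(len(trace)), key=lambda t: trace[t])

order = sorted(range(len(responses)), key=lambda i: peak_time(responses[i]))
print(order)  # → [1, 0, 2]
```

Peak-time sorting fails as soon as neurons multiplex several variables; that failure mode is exactly what motivates a global manifold-sorting approach.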
There is rich variety in the activity of single neurons recorded during behaviour. Yet, these diverse single neuron responses can be well described by relatively few patterns of neural co-modulation. The study of such low-dimensional structure of neural population activity has provided important insights into how the brain generates behaviour. Virtually all of these studies have used linear dimensionality reduction techniques to estimate these population-wide co-modulation patterns, constraining them to a flat "neural manifold". Here, we hypothesised that since neurons have nonlinear responses and make thousands of distributed and recurrent connections that likely amplify such nonlinearities, neural manifolds should be intrinsically nonlinear. Combining neural population recordings from monkey motor cortex, mouse motor cortex, mouse striatum, and human motor cortex, we show that: 1) neural manifolds are intrinsically nonlinear; 2) the degree of their nonlinearity varies across architecturally distinct brain regions; and 3) manifold nonlinearity becomes more evident during complex tasks that require more varied activity patterns. Simulations using recurrent neural network models confirmed the proposed relationship between circuit connectivity and manifold nonlinearity, including the differences across architecturally distinct regions. Thus, neural manifolds underlying the generation of behaviour are inherently nonlinear, and properly accounting for such nonlinearities will be critical as neuroscientists move towards studying numerous brain regions involved in increasingly complex and naturalistic behaviours.
The growing size of EM volumes is a significant barrier to findable, accessible, interoperable, and reusable (FAIR) sharing. Storage, sharing, visualization and processing are challenging for large datasets. Here we discuss a recent development toward the standardized storage of volume electron microscopy (vEM) data which addresses many of the issues that researchers face. The OME-Zarr format splits data into more manageable, performant chunks enabling streaming-based access, and unifies important metadata such as multiresolution pyramid descriptions. The file format is designed for centralized and remote storage (e.g., cloud storage or file system) and is therefore ideal for sharing large data. By coalescing around a common, community-wide format, these benefits will expand as ever more data is made available to the scientific community.
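The chunking idea behind formats like OME-Zarr can be made concrete with a little index arithmetic: a large volume is split into fixed-size chunks, so reading one voxel touches only the chunk that contains it. The chunk shape below is illustrative, not a format requirement.

```python
# Sketch of chunked-volume addressing, the core idea behind streaming
# access in formats like OME-Zarr. The chunk shape is illustrative.
chunk_shape = (64, 64, 64)  # (z, y, x) voxels per chunk

def chunk_for_voxel(voxel, chunk_shape=chunk_shape):
    """Return (chunk index, offset within that chunk) for a voxel coordinate."""
    index = tuple(c // s for c, s in zip(voxel, chunk_shape))
    offset = tuple(c % s for c, s in zip(voxel, chunk_shape))
    return index, offset

index, offset = chunk_for_voxel((130, 70, 5))
print(index, offset)  # → (2, 1, 0) (2, 6, 5)
```

A reader streaming from cloud storage fetches only the chunks whose indices intersect the requested region, which is what makes remote visualization of multi-terabyte volumes tractable.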
Targeting deep brain structures during electrophysiology and injections requires intensive training and expertise. Even with experience, researchers often cannot be certain that a probe is placed precisely in a target location, and this complexity scales with the number of simultaneous probes used in an experiment. Here, we present Pinpoint, open-source software that allows for interactive exploration of stereotaxic insertion plans. Once an insertion plan is created, Pinpoint allows users to save their plans online and share them with collaborators. 3D modeling tools allow users to explore their insertions alongside rig and implant hardware and ensure plans are physically possible. Probes in Pinpoint can be linked to electronic micro-manipulators, allowing real-time visualization of current brain region targets alongside neural data. In addition, Pinpoint can control manipulators to automate and parallelize the insertion process. Compared to previously available software, Pinpoint's easy access through web browsers, extensive features, and real-time experiment integration enable more efficient and reproducible recordings.
Single molecule localization microscopy relies on the precise quantification of the position of single dye emitters in a sample. This precision improves with the number of photons that can be detected from each molecule. In particular, recording at cryogenic temperatures dramatically reduces photobleaching and would hence, in principle, allow the user to increase the illumination time massively, to several seconds. The downside of long illuminations, however, would be image blur due to inevitable jitter or drift occurring during the illuminations, which deteriorates the localization precision. In this paper, we theoretically demonstrate that parallel recording of fiducial marker beads, together with a fitting approach accounting for the full drift trajectory, allows for largely eliminating drift effects for drift magnitudes of several hundred nanometers per frame. We showcase the method for linear and diffusional drift as well as oscillations, assuming fixed dipole orientations during each illumination.
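The basic fiducial-correction idea can be sketched numerically. The paper's approach fits the full drift trajectory jointly with the localization; this toy version, with made-up 1D positions, just subtracts the bead's per-frame trajectory from the molecule's raw localizations and averages.

```python
# Toy 1D fiducial drift correction (a simplification of the paper's
# joint-trajectory fit). All positions are illustrative, in nanometers.
fiducial = [0.0, 50.0, 120.0, 200.0]         # bead position per frame (drift)
molecule_raw = [500.0, 551.0, 619.0, 701.0]  # raw molecule localizations

# Subtract the drift trajectory frame by frame, then average.
corrected = [m - f for m, f in zip(molecule_raw, fiducial)]
estimate = sum(corrected) / len(corrected)
print(estimate)  # → 500.25
```

Frame-by-frame subtraction like this only works when the bead is localized precisely in every frame; fitting the whole trajectory instead pools photons across frames, which is what tolerates the large per-frame drift magnitudes quoted in the abstract.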
The ability to discriminate sensory stimuli with overlapping features is thought to arise in brain structures called expansion layers, where neurons carrying information about sensory features make combinatorial connections onto a much larger set of cells. For 50 years, expansion coding has been a prime topic of theoretical neuroscience, which seeks to explain how quantitative parameters of the expansion circuit influence sensory sensitivity, discrimination, and generalization. Here, we investigate the developmental events that produce the quantitative parameters of the arthropod expansion layer, called the mushroom body. Using Drosophila melanogaster as a model, we employ genetic and chemical tools to engineer changes to circuit development. These allow us to produce living animals with hypothesis-driven variations on natural expansion layer wiring parameters. We then test the functional and behavioral consequences. By altering the number of expansion layer neurons (Kenyon cells) and their dendritic complexity, we find that input density, but not cell number, tunes neuronal odor selectivity. Simple odor discrimination behavior is maintained when Kenyon cell number is reduced, and is augmented by Kenyon cell number expansion. Animals with increased input density to each Kenyon cell show increased overlap in Kenyon cell odor responses and become worse at odor discrimination tasks.
Life exists in three dimensions, but until the turn of the century most electron microscopy methods provided only 2D image data. Recently, electron microscopy techniques capable of delving deep into the structure of cells and tissues have emerged, collectively called volume electron microscopy (vEM). Developments in vEM have been dubbed a quiet revolution as the field evolved from established transmission and scanning electron microscopy techniques, so early publications largely focused on the bioscience applications rather than the underlying technological breakthroughs. However, with an explosion in the uptake of vEM across the biosciences and fast-paced advances in volume, resolution, throughput and ease of use, it is timely to introduce the field to new audiences. In this Primer, we introduce the different vEM imaging modalities, the specialized sample processing and image analysis pipelines that accompany each modality and the types of information revealed in the data. We showcase key applications in the biosciences where vEM has helped make breakthrough discoveries and consider limitations and future directions. We aim to show new users how vEM can support discovery science in their own research fields and inspire broader uptake of the technology, finally allowing its full adoption into mainstream biological imaging.