2670 Janelia Publications
Showing 1221-1230 of 2670 results

Nervous systems have the ability to select appropriate actions and action sequences in response to sensory cues. The circuit mechanisms by which nervous systems achieve choice, stability, and transitions between behaviors are still incompletely understood. To identify neurons and brain areas involved in controlling these processes, we combined a large-scale neuronal inactivation screen with automated action detection in response to a mechanosensory cue in the Drosophila larva. We analyzed behaviors from 2.9 × 10⁵ larvae and identified 66 candidate lines for mechanosensory responses, 25 of which are candidates for competitive interactions between actions. We further characterized the neurons in these lines in detail and analyzed their connectivity using electron microscopy. We found that the neurons in the mechanosensory network are located in different regions of the nervous system, consistent with a distributed model of sensorimotor decision-making. These findings provide the basis for understanding how the selection of, and transitions between, behaviors are controlled by the nervous system.
The Drosophila brain is formed by an invariant set of lineages, each of which is derived from a unique neural stem cell (neuroblast) and forms a genetic and structural unit of the brain. The task of reconstructing brain circuitry at the level of individual neurons can be made significantly easier by assigning neurons to their respective lineages. In this article we address the automation of neuron and lineage identification. We focused on the Drosophila brain lineages at the larval stage, when they form easily recognizable secondary axon tracts (SATs) that were previously only partially characterized. We have now generated an annotated digital database containing all lineage tracts reconstructed from five registered wild-type brains, at higher resolution and including some that were not previously characterized. We developed a method for SAT structural comparisons based on a dynamic programming approach akin to nucleotide sequence alignment, together with a machine learning classifier trained on the annotated database of reference SATs. We quantified the stereotypy of SATs by measuring the residual variability of aligned wild-type SATs. Next, we used our method to identify SATs within wild-type larval brains and found it highly accurate (93-99%). The method also proved highly robust for the identification of lineages in mutant brains and in brains that differed in developmental time or labeling. We describe for the first time an algorithm that quantifies neuronal projection stereotypy in the Drosophila brain and use the algorithm for automatic neuron and lineage recognition.
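The tract-alignment idea in the abstract above can be illustrated with a minimal dynamic-programming sketch. The point-sequence representation, the negative-Euclidean-distance scoring, and the constant gap penalty below are illustrative assumptions, not the published method; the sketch only shows how a Needleman-Wunsch-style recursion yields a similarity score between two axon tracts.

```python
import numpy as np

def align_tracts(a, b, gap=-2.0):
    """Globally align two axon tracts (sequences of 3D points) with a
    Needleman-Wunsch-style dynamic program. Point similarity is the
    negative Euclidean distance (a placeholder scoring choice)."""
    n, m = len(a), len(b)
    score = np.zeros((n + 1, m + 1))
    score[1:, 0] = gap * np.arange(1, n + 1)   # gaps along tract a
    score[0, 1:] = gap * np.arange(1, m + 1)   # gaps along tract b
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = score[i - 1, j - 1] - np.linalg.norm(a[i - 1] - b[j - 1])
            score[i, j] = max(match, score[i - 1, j] + gap, score[i, j - 1] + gap)
    return score[n, m]   # higher = more similar tract geometry

# Toy usage: two short tracts sampled as 3D point sequences.
tract_a = np.array([[0, 0, 0], [1, 0, 0], [2, 1, 0]], dtype=float)
tract_b = np.array([[0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 1, 0]], dtype=float)
print(align_tracts(tract_a, tract_b))
```

One plausible reading of the classification step described above is to score a query SAT against every reference tract in the annotated database in this way and assign the lineage of the best-scoring match.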
In this work, we propose a learning framework for identifying synapses using a deep and wide multi-scale recursive (DAWMR) network, previously considered in image segmentation applications. We apply this approach to electron microscopy data from invertebrate fly brain tissue. By learning features directly from the data, we achieve considerable improvements over existing techniques that rely on a small set of hand-designed features. We show that this system can reduce the amount of manual annotation required, both in the acquisition of training data and in the verification of inferred detections.
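As a rough illustration of the multi-scale idea only (this is not the published DAWMR architecture; the framework choice of PyTorch, the channel counts, and the two-scale design are assumptions for the sketch), the following snippet fuses features computed from an EM patch at two resolutions into a per-voxel synapse probability map:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleSynapseNet(nn.Module):
    """Toy two-scale 3D conv net: one branch sees the EM patch at full
    resolution, the other at 2x downsampling; features are fused and
    mapped to a per-voxel synapse probability."""
    def __init__(self):
        super().__init__()
        self.fine = nn.Conv3d(1, 8, kernel_size=3, padding=1)
        self.coarse = nn.Conv3d(1, 8, kernel_size=3, padding=1)
        self.head = nn.Conv3d(16, 1, kernel_size=1)

    def forward(self, x):                                   # x: (batch, 1, D, H, W)
        f = F.relu(self.fine(x))
        c = F.relu(self.coarse(F.avg_pool3d(x, 2)))
        c = F.interpolate(c, size=x.shape[2:], mode="trilinear", align_corners=False)
        return torch.sigmoid(self.head(torch.cat([f, c], dim=1)))

# Toy usage on a random 32^3 EM patch.
net = MultiScaleSynapseNet()
prob = net(torch.rand(1, 1, 32, 32, 32))    # per-voxel synapse probability map
```

Training such a network end to end on labeled EM volumes is the sense in which features are "learned directly from the data" rather than hand-designed, per the abstract above.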
An idiosyncratic tendency to choose one alternative over others in the absence of an identified reason is a common observation in two-alternative forced-choice experiments. It is tempting to account for it as resulting from the (unknown) participant-specific history and thus to treat it as measurement noise. Indeed, idiosyncratic choice biases are typically considered a nuisance, and care is taken to account for them by adding an ad hoc bias parameter or by counterbalancing the choices to average them out. Here we quantify idiosyncratic choice biases in a perceptual discrimination task and in a motor task, and report substantial and significant biases in both cases. We then present theoretical evidence that, even in idealized experiments in which the settings are symmetric, idiosyncratic choice bias is expected to emerge from the dynamics of competing neuronal networks. We thus argue that idiosyncratic choice bias reflects the microscopic dynamics of choice and is therefore virtually inevitable in any comparison or decision task.
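The claim that choice bias can emerge from the dynamics of competing neuronal networks can be made concrete with a toy simulation. The rate model, the parameters, and the frozen per-individual weight perturbation below are illustrative assumptions, not the authors' model; the point is only that a small, fixed asymmetry in an otherwise symmetric winner-take-all circuit biases which option tends to win across trials.

```python
import numpy as np

def simulate_individual(rng, n_trials=100, dt=1e-3, T=0.3, tau=0.02):
    """Two mutually inhibitory rate units receiving statistically identical
    noisy drive. A frozen, individual-specific perturbation of the
    cross-inhibition weights biases which unit wins on each trial."""
    w = 6.0 * (1.0 + 0.1 * rng.standard_normal(2))   # frozen idiosyncratic weights
    choices = []
    for _ in range(n_trials):
        r = np.zeros(2)
        for _ in range(int(T / dt)):
            drive = 1.0 + 0.5 * rng.standard_normal(2)        # symmetric noisy input
            drive -= w[::-1] * r[::-1]                        # cross-inhibition
            r += (dt / tau) * (-r + np.maximum(drive, 0.0))   # rate dynamics
        choices.append(int(r[1] > r[0]))                      # which unit won
    return float(np.mean(choices))            # fraction of trials choosing option 1

rng = np.random.default_rng(0)
choice_fractions = [simulate_individual(rng) for _ in range(10)]
print(np.round(choice_fractions, 2))          # spread across simulated "individuals"
```

The spread of choice fractions across simulated individuals reflects both the frozen connectivity asymmetries and finite-trial sampling, loosely mirroring the idiosyncratic biases quantified in the abstract above.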
Individuals vary in their innate behaviors, even when they have the same genome and have been reared in the same environment. The extent of individuality in plastic behaviors, like learning, is less well characterized. Also unknown is the extent to which intragenotypic differences in learning generalize: if an individual performs well in one assay, will it perform well in other assays? We investigated this using the fruit fly Drosophila melanogaster, an organism long used to study the mechanistic basis of learning and memory. We found that isogenic flies, reared in identical lab conditions, and subject to classical conditioning that associated odorants with electric shock, exhibit clear individuality in their learning responses. Flies that performed well when an odor was paired with shock tended to perform well when other odors were paired with shock, or when the original odor was paired with bitter taste. Thus, individuality in learning performance appears to be prominent in isogenic animals reared identically, and individual differences in learning performance generalize across stimulus modalities. Establishing these results in flies opens up the possibility of studying the genetic and neural circuit basis of individual differences in learning in a highly suitable model organism.
Individuals vary in their innate behaviours, even when they have the same genome and have been reared in the same environment. The extent of individuality in plastic behaviours, like learning, is less well characterized. Also unknown is the extent to which intragenotypic differences in learning generalize: if an individual performs well in one assay, will it perform well in other assays? We investigated this using the fruit fly Drosophila melanogaster, an organism long used to study the mechanistic basis of learning and memory. We found that isogenic flies, reared in identical laboratory conditions, and subject to classical conditioning that associated odorants with electric shock, exhibit clear individuality in their learning responses. Flies that performed well when an odour was paired with shock tended to perform well when the odour was paired with bitter taste or when other odours were paired with shock. Thus, individuality in learning performance appears to be prominent in isogenic animals reared identically, and individual differences in learning performance generalize across some aversive sensory modalities. Establishing these results in flies opens up the possibility of studying the genetic and neural circuit basis of individual differences in learning in a highly suitable model organism.
Innate behavioral biases and preferences can vary significantly among individuals of the same genotype. Though individuality is a fundamental property of behavior, it is not currently understood how individual differences in brain structure and physiology produce idiosyncratic behaviors. Here we present evidence for idiosyncrasy in olfactory behavior and neural responses in Drosophila. We show that individual female flies from a highly inbred laboratory strain exhibit idiosyncratic odor preferences that persist for days. We used in vivo calcium imaging to compare the responses of projection neurons (second-order neurons that convey odor information from the sensory periphery to the central brain) to the same odors across animals. We found that, while odor responses appear grossly stereotyped, upon closer inspection many individual differences are apparent across antennal lobe (AL) glomeruli (compact microcircuits corresponding to different odor channels). Moreover, we show that neuromodulation, environmental stress in the form of altered nutrition, and the activity of certain AL local interneurons affect the magnitude of interfly behavioral variability. Taken together, this work demonstrates that individual flies exhibit idiosyncratic olfactory preferences and idiosyncratic neural responses to odors, and that behavioral idiosyncrasies are subject to neuromodulation and to regulation by neurons in the AL.
Information processing relies on precise patterns of synapses between neurons. The cellular recognition mechanisms regulating this specificity are poorly understood. In the medulla of the Drosophila visual system, different neurons form synaptic connections in different layers. Here, we sought to identify candidate cell recognition molecules underlying this specificity. Using RNA sequencing (RNA-seq), we show that neurons with different synaptic specificities express unique combinations of mRNAs encoding hundreds of cell surface and secreted proteins. Using RNA-seq and protein tagging, we demonstrate that 21 paralogs of the Dpr family, a subclass of immunoglobulin (Ig)-domain containing proteins, are expressed in unique combinations in homologous neurons with different layer-specific synaptic connections. Dpr interacting proteins (DIPs), comprising nine paralogs of another subclass of Ig-containing proteins, are expressed in a complementary layer-specific fashion in a subset of synaptic partners. We propose that pairs of Dpr/DIP paralogs contribute to layer-specific patterns of synaptic connectivity.
Monitoring GABAergic inhibition in the nervous system has been enabled by the development of an intensiometric molecular sensor that directly detects GABA. However, the first-generation iGABASnFR exhibits a low signal-to-noise ratio and suboptimal kinetics, making in vivo experiments challenging. To improve sensor performance, we targeted several sites in the protein for near-saturation mutagenesis and evaluated the resulting sensor variants in a high-throughput screening system using evoked synaptic release in primary cultured neurons. This identified a sensor variant, iGABASnFR2, with 4.2-fold improved sensitivity, 20% faster kinetics, and a binding affinity that remained in a range sensitive to changes in GABA concentration at synapses. We also identified sensors with an inverted response, decreasing fluorescence intensity upon GABA binding. We termed the best such negative-going sensor iGABASnFR2n; it can be used to corroborate observations made with the positive-going sensor. These improvements yielded a qualitative enhancement of in vivo performance, enabling us to make the first measurements of direction-selective GABA release in the retina and confirm a longstanding hypothesis for how sensitivity to motion arises in the visual system.
We present ilastik, an easy-to-use interactive tool that brings machine-learning-based (bio)image analysis to end users without substantial computational expertise. It contains pre-defined workflows for image segmentation, object classification, counting, and tracking. Users adapt the workflows to the problem at hand by interactively providing sparse training annotations for a nonlinear classifier. ilastik can process data in up to five dimensions (3D, time, and number of channels). Its computational back end runs operations on demand wherever possible, allowing for interactive prediction on data larger than RAM. Once the classifiers are trained, ilastik workflows can be applied to new data from the command line without further user interaction. We describe all ilastik workflows in detail, including three case studies and a discussion of the expected performance.
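The kind of workflow ilastik provides interactively (sparse annotations, a bank of pixel features, a nonlinear classifier, dense prediction) can be sketched schematically outside the tool. The snippet below is not ilastik's API: the filter choices, the scikit-learn random forest, and the synthetic image and scribbles are stand-ins, assumed only to make the sparse-training idea concrete.

```python
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def pixel_features(img):
    """A small filter bank (Gaussian smoothing at a few scales plus a
    gradient-magnitude channel) as stand-in pixel features."""
    feats = [ndimage.gaussian_filter(img, s) for s in (1.0, 2.0, 4.0)]
    feats.append(ndimage.gaussian_gradient_magnitude(img, 1.0))
    return np.stack(feats, axis=-1)                # shape (H, W, n_features)

# img: a 2D grayscale image; labels: sparse user scribbles
# (0 = unlabeled, 1..K = classes). Both are hypothetical inputs here.
rng = np.random.default_rng(0)
img = rng.random((128, 128))
labels = np.zeros((128, 128), dtype=int)
labels[10:20, 10:20] = 1                           # a few "class 1" scribbles
labels[100:110, 100:110] = 2                       # a few "class 2" scribbles

X = pixel_features(img).reshape(-1, 4)
y = labels.ravel()
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[y > 0], y[y > 0])
prediction = clf.predict(X).reshape(img.shape)     # dense per-pixel class map
```

ilastik's back end achieves interactivity on data larger than RAM by running such feature and prediction operations on demand, as noted in the abstract above.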