2492 Janelia Publications

Showing 2461-2470 of 2492 results

05/05/17 | What can tiny mushrooms in fruit flies tell us about learning and memory?
Hige T
Neuroscience Research. 2017 May 05;129:8-16. doi: 10.1016/j.neures.2017.05.002

Nervous systems have evolved to translate external stimuli into appropriate behavioral responses. In an ever-changing environment, flexible adjustment of behavioral choice by experience-dependent learning is essential for the animal's survival. Associative learning is a simple form of learning that is widely observed from worms to humans. To understand the whole process of learning, we need to know how sensory information is represented and transformed in the brain, how it is changed by experience, and how the changes are reflected on motor output. To tackle these questions, studying numerically simple invertebrate nervous systems has a great advantage. In this review, I will feature the Pavlovian olfactory learning in the fruit fly, Drosophila melanogaster. The mushroom body is a key brain area for the olfactory learning in this organism. Recently, comprehensive anatomical information and the genetic tool sets were made available for the mushroom body circuit. This greatly accelerated the physiological understanding of the learning process. One of the key findings was dopamine-induced long-term synaptic plasticity that can alter the representations of stimulus valence. I will mostly focus on the new studies within these few years and discuss what we can possibly learn about the vertebrate systems from this model organism.

01/13/20 | When does midbrain dopamine activity exert its effects on behavior?
Coddington LT
Nature Neuroscience. 2020 Jan 13;23(2):154-6. doi: 10.1038/s41593-019-0577-y

Riddiford Lab
05/15/11 | When is weight critical?
Riddiford LM
The Journal of Experimental Biology. 2011 May 15;214(Pt 10):1613-5. doi: 10.1242/jeb.049098

03/15/22 | When light meets biology - how the specimen affects quantitative microscopy.
Reiche MA, Aaron JS, Boehm U, DeSantis MC, Hobson CM, Khuon S, Lee RM, Chew T
Journal of Cell Science. 2022 Mar 15;135(6). doi: 10.1242/jcs.259656

Fluorescence microscopy images should not be treated as perfect representations of biology. Many factors within the biospecimen itself can drastically affect quantitative microscopy data. Whereas some sample-specific considerations, such as photobleaching and autofluorescence, are more commonly discussed, a holistic discussion of sample-related issues (which includes less-routine topics such as quenching, scattering and biological anisotropy) is required to appropriately guide life scientists through the subtleties inherent to bioimaging. Here, we consider how the interplay between light and a sample can cause common experimental pitfalls and unanticipated errors when drawing biological conclusions. Although some of these discrepancies can be minimized or controlled for, others require more pragmatic considerations when interpreting image data. Ultimately, the power lies in the hands of the experimenter. The goal of this Review is therefore to survey how biological samples can skew quantification and interpretation of microscopy data. Furthermore, we offer a perspective on how to manage many of these potential pitfalls.
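To make one of the pitfalls named above concrete, here is a minimal, illustrative Python sketch (not from the paper) of the kind of correction often applied for photobleaching: fit a mono-exponential decay to the mean frame intensity of a time-lapse stack and divide the trend out. The function name, the mono-exponential model, and the synthetic example are assumptions for illustration; real specimens frequently bleach non-uniformly, which is exactly the review's caution about sample-dependent effects.

```python
import numpy as np
from scipy.optimize import curve_fit

def bleach_correct(stack):
    """Divide out a fitted mono-exponential photobleaching trend from a
    time-lapse stack of shape (t, y, x). Assumes global, mono-exponential
    bleaching -- an assumption real specimens frequently violate."""
    t = np.arange(stack.shape[0], dtype=float)
    mean_intensity = stack.reshape(stack.shape[0], -1).mean(axis=1)

    def decay(t, amp, rate, offset):          # I(t) = amp * exp(-rate * t) + offset
        return amp * np.exp(-rate * t) + offset

    p0 = (mean_intensity[0] - mean_intensity[-1], 1.0 / len(t), mean_intensity[-1])
    params, _ = curve_fit(decay, t, mean_intensity, p0=p0, maxfev=10000)

    trend = decay(t, *params) / decay(0.0, *params)  # normalized to 1 at t = 0
    return stack / trend[:, None, None]

# Synthetic check: a static scene bleached at 5% per frame comes back flat.
rng = np.random.default_rng(0)
scene = rng.uniform(100.0, 200.0, size=(1, 64, 64)).repeat(50, axis=0)
bleached = scene * np.exp(-0.05 * np.arange(50.0))[:, None, None]
corrected = bleach_correct(bleached)
print(corrected.mean(axis=(1, 2))[[0, -1]])  # first and last frame means ~equal
```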

05/19/21 | Which image-based phenotypes are most promising for using AI to understand cellular functions and why?
Lundberg E, Funke J, Uhlmann V, Gerlich D, Walter T, Carpenter A, Coelho LP
Cell Systems. 2021 May 19;12(5):384-387. doi: 10.1016/j.cels.2021.04.012

Svoboda Lab
02/16/15 | Whisking.
Sofroniew NJ, Svoboda K
Current Biology. 2015 Feb 16;25(4):R137-40. doi: 10.1016/j.cub.2015.01.008

Eyes may be 'the window to the soul' in humans, but whiskers provide a better path to the inner lives of rodents. The brain has remarkable abilities to focus its limited resources on information that matters, while ignoring a cacophony of distractions. While inspecting a visual scene, primates foveate to multiple salient locations, for example mouths and eyes in images of people, and ignore the rest. Similar processes have now been observed and studied in rodents in the context of whisker-based tactile sensation. Rodents use their mechanosensitive whiskers for a diverse range of tactile behaviors such as navigation, object recognition and social interactions. These animals move their whiskers in a purposive manner to locations of interest. The shapes of whiskers, as well as their movements, are exquisitely adapted for tactile exploration in the dark tight burrows where many rodents live. By studying whisker movements during tactile behaviors, we can learn about the tactile information available to rodents through their whiskers and how rodents direct their attention. In this primer, we focus on how the whisker movements of rats and mice are providing clues about the logic of active sensation and the underlying neural mechanisms.

Cardona Lab, Funke Lab
11/18/15 | Who is talking to whom: Synaptic partner detection in anisotropic volumes of insect brain.
Kreshuk A, Funke J, Cardona A, Hamprecht FA
Medical Image Computing and Computer-Assisted Intervention -- MICCAI 2015:661-8. doi: 10.1007/978-3-319-24553-9_81

Automated reconstruction of neural connectivity graphs from electron microscopy image stacks is an essential step towards large-scale neural circuit mapping. While significant progress has recently been made in automated segmentation of neurons and detection of synapses, the problem of synaptic partner assignment for polyadic (one-to-many) synapses, prevalent in the Drosophila brain, remains unsolved. In this contribution, we propose a method which automatically assigns pre- and postsynaptic roles to neurites adjacent to a synaptic site. The method constructs a probabilistic graphical model over potential synaptic partner pairs which includes factors to account for a high rate of one-to-many connections, as well as the possibility of the same neuron to be pre-synaptic in one synapse and post-synaptic in another. The algorithm has been validated on a publicly available stack of ssTEM images of Drosophila neural tissue and has been shown to reconstruct most of the synaptic relations correctly.
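The full method performs inference in a probabilistic graphical model over candidate partner pairs; as a rough illustration only, the sketch below replaces that with a greedy per-site heuristic: score every (pre, post) pair of neurites adjacent to the synaptic cleft with a learned pair probability, pick the presynaptic neurite with the largest summed evidence, and accept every remaining neurite whose pair score clears a threshold, so one-to-many (polyadic) connections emerge naturally. `assign_partners`, `pair_score`, and the threshold are hypothetical stand-ins, not the authors' implementation.

```python
def assign_partners(candidate_neurites, pair_score, threshold=0.5):
    """Greedy per-site stand-in for the paper's graphical-model inference.

    candidate_neurites: ids of neurites adjacent to one synaptic site.
    pair_score(pre, post): learned probability that `pre` synapses onto `post`
        at this site (a user-supplied callable; hypothetical in this sketch).
    Returns the chosen presynaptic neurite and its postsynaptic partners,
    allowing the one-to-many (polyadic) connections common in Drosophila.
    """
    best_pre, best_total = None, float("-inf")
    for pre in candidate_neurites:
        total = sum(pair_score(pre, post)
                    for post in candidate_neurites if post != pre)
        if total > best_total:
            best_pre, best_total = pre, total

    partners = [post for post in candidate_neurites
                if post != best_pre and pair_score(best_pre, post) >= threshold]
    return best_pre, partners

# Toy example: a hand-made score table standing in for classifier output.
scores = {("a", "b"): 0.9, ("a", "c"): 0.7, ("a", "d"): 0.2,
          ("b", "a"): 0.1, ("b", "c"): 0.3, ("b", "d"): 0.2,
          ("c", "a"): 0.2, ("c", "b"): 0.1, ("c", "d"): 0.1,
          ("d", "a"): 0.1, ("d", "b"): 0.1, ("d", "c"): 0.1}
print(assign_partners(list("abcd"), lambda p, q: scores[(p, q)]))
# -> ('a', ['b', 'c']): neurite 'a' is presynaptic to both 'b' and 'c'
```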

10/26/15 | Whole-animal functional and developmental imaging with isotropic spatial resolution
Chhetri RK, Amat F, Wan Y, Höckendorf B, Lemon WC, Keller PJ
Nature Methods. 2015 Oct 26;12(12):1171-8. doi: 10.1038/nmeth.3632

Imaging fast cellular dynamics across large specimens requires high resolution in all dimensions, high imaging speeds, good physical coverage and low photo-damage. To meet these requirements, we developed isotropic multiview (IsoView) light-sheet microscopy, which rapidly images large specimens via simultaneous light-sheet illumination and fluorescence detection along four orthogonal directions. Combining these four views by means of high-throughput multiview deconvolution yields images with high resolution in all three dimensions. We demonstrate whole-animal functional imaging of Drosophila larvae at a spatial resolution of 1.1-2.5 μm and temporal resolution of 2 Hz for several hours. We also present spatially isotropic whole-brain functional imaging in Danio rerio larvae and spatially isotropic multicolor imaging of fast cellular dynamics across gastrulating Drosophila embryos. Compared with conventional light-sheet microscopy, IsoView microscopy improves spatial resolution at least sevenfold and decreases resolution anisotropy at least threefold. Compared with existing high-resolution light-sheet techniques, IsoView microscopy effectively doubles the penetration depth and provides subsecond temporal resolution for specimens 400-fold larger than could previously be imaged.
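The central computational step, combining the four orthogonal views, is a multiview deconvolution. The sketch below shows a generic sequential multiview Richardson-Lucy update in Python/NumPy, purely to illustrate how each view's sharp axis constrains a shared estimate; it is not the IsoView implementation (which is high-throughput and GPU-accelerated), and the toy 2D point-spread functions are assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def multiview_richardson_lucy(views, psfs, n_iter=25, eps=1e-6):
    """Sequential multiview Richardson-Lucy: cycle through registered views
    (e.g. four orthogonal detections resampled onto one grid) and apply the
    standard RL multiplicative update for each view's PSF, so every view's
    high-resolution direction constrains the shared estimate."""
    estimate = np.mean(views, axis=0)
    for _ in range(n_iter):
        for data, psf in zip(views, psfs):
            blurred = convolve(estimate, psf, mode="reflect")
            ratio = data / (blurred + eps)
            estimate = estimate * convolve(ratio, np.flip(psf), mode="reflect")
    return estimate

# Toy 2D demo: one "view" blurred along x, the other along y.
truth = np.zeros((64, 64))
truth[20:24, 30:34] = 1.0
psf_x = np.hanning(9)[None, :]
psf_x /= psf_x.sum()
psf_y = psf_x.T.copy()
views = [convolve(truth, psf_x, mode="reflect"),
         convolve(truth, psf_y, mode="reflect")]
fused = multiview_richardson_lucy(views, [psf_x, psf_y])
```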

07/18/16 | Whole-animal imaging with high spatio-temporal resolution.
Chhetri R, Amat F, Wan Y, Höckendorf B, Lemon WC, Keller PJ
Proceedings of SPIE. 2016 Jul 18;9720:97200R. doi: 10.1117/12.2212564

We developed isotropic multiview (IsoView) light-sheet microscopy in order to image fast cellular dynamics, such as cell movements in an entire developing embryo or neuronal activity throughout an entire brain or nervous system, with high resolution in all dimensions, high imaging speeds, good physical coverage and low photo-damage. To achieve high temporal resolution and high spatial resolution at the same time, IsoView microscopy rapidly images large specimens via simultaneous light-sheet illumination and fluorescence detection along four orthogonal directions. In a post-processing step, these four views are then combined by means of high-throughput multiview deconvolution to yield images with a system resolution of ≤ 450 nm in all three dimensions. Using IsoView microscopy, we performed whole-animal functional imaging of Drosophila embryos and larvae at a spatial resolution of 1.1-2.5 μm and at a temporal resolution of 2 Hz for up to 9 hours. We also performed whole-brain functional imaging in larval zebrafish and multicolor imaging of fast cellular dynamics across entire, gastrulating Drosophila embryos with isotropic, sub-cellular resolution. Compared with conventional (spatially anisotropic) light-sheet microscopy, IsoView microscopy improves spatial resolution at least sevenfold and decreases resolution anisotropy at least threefold. Compared with existing high-resolution light-sheet techniques, such as lattice light-sheet microscopy or diSPIM, IsoView microscopy effectively doubles the penetration depth and provides subsecond temporal resolution for specimens 400-fold larger than could previously be imaged.

03/14/24 | Whole-body simulation of realistic fruit fly locomotion with deep reinforcement learning
Vaxenburg R, Siwanowicz I, Merel J, Robie AA, Morrow C, Novati G, Stefanidi Z, Card GM, Reiser MB, Botvinick MM, Branson KM, Tassa Y, Turaga SC
bioRxiv. 2024 Mar 14. doi: 10.1101/2024.03.11.584515

The body of an animal determines how the nervous system produces behavior. Therefore, detailed modeling of the neural control of sensorimotor behavior requires a detailed model of the body. Here we contribute an anatomically-detailed biomechanical whole-body model of the fruit fly Drosophila melanogaster in the MuJoCo physics engine. Our model is general-purpose, enabling the simulation of diverse fly behaviors, both on land and in the air. We demonstrate the generality of our model by simulating realistic locomotion, both flight and walking. To support these behaviors, we have extended MuJoCo with phenomenological models of fluid forces and adhesion forces. Through data-driven end-to-end reinforcement learning, we demonstrate that these advances enable the training of neural network controllers capable of realistic locomotion along complex trajectories based on high-level steering control signals. With a visually guided flight task, we demonstrate a neural controller that can use the vision sensors of the body model to control and steer flight. Our project is an open-source platform for modeling neural control of sensorimotor behavior in an embodied context.
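As a hedged sketch of what simulating a MuJoCo body model under a learned controller looks like in practice, the snippet below uses the official `mujoco` Python bindings to load an MJCF model, write actuator commands from a policy callable, and step the physics. The model filename and the random placeholder policy are hypothetical, and the paper's fluid-force and adhesion extensions to MuJoCo are not reproduced here.

```python
import numpy as np
import mujoco  # official MuJoCo Python bindings (pip install mujoco)

MODEL_XML = "fruitfly.xml"  # placeholder path to the released fly MJCF file

def rollout(policy, n_steps=1000):
    """Load the body model and run the loop an RL controller is evaluated in:
    observe joint state, write actuator commands, advance the physics."""
    model = mujoco.MjModel.from_xml_path(MODEL_XML)
    data = mujoco.MjData(model)
    qpos_history = []
    for _ in range(n_steps):
        obs = np.concatenate([data.qpos, data.qvel])  # minimal observation
        data.ctrl[:] = policy(obs, model.nu)          # one command per actuator
        mujoco.mj_step(model, data)                   # advance one physics step
        qpos_history.append(data.qpos.copy())
    return np.array(qpos_history)

# Placeholder policy: small random actuation standing in for a trained network.
trajectory = rollout(lambda obs, nu: 0.05 * np.random.randn(nu))
print(trajectory.shape)
```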
