1416 Publications
Showing 1–10 of 1416 results
PURPOSE: This paper describes an approach for the three-dimensional (3D) shape and pose reconstruction of the human rib cage from a few segmented two-dimensional (2D) projection images. Our work is aimed at supporting temporal subtraction techniques of subsequently acquired radiographs by establishing a method for the assessment of pose differences in sequences of chest radiographs of the same patient. METHODS: The reconstruction method is based on a 3D statistical shape model (SSM) of the rib cage, which is adapted to binary 2D projection images of an individual rib cage. To drive the adaptation we minimize a distance measure that quantifies the dissimilarities between 2D projections of the 3D SSM and the projection images of the individual rib cage. We propose different silhouette-based distance measures and evaluate their suitability for the adaptation of the SSM to the projection images. RESULTS: An evaluation was performed on 29 sets of biplanar binary images (posterior-anterior and lateral). Depending on the chosen distance measure, our experiments on the combined reconstruction of shape and pose of the rib cages yield mean 3D surface distances of 2.2 to 4.7 mm. Given the geometry of an individual rib cage, the rotational errors for the pose reconstruction range from 0.1 to 0.9 degrees. CONCLUSIONS: The results show that our method is suitable for the estimation of pose differences of the human rib cage in binary projection images. Thus, it is able to provide crucial 3D information for registration during the generation of 2D subtraction images.
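The core of such silhouette-based fitting is a scalar distance between two binary silhouettes that an optimizer can minimize over the SSM's pose and shape parameters. The sketch below is a minimal illustration under assumptions, not the paper's implementation: it uses a symmetric mean contour-to-contour distance, and all function names are hypothetical.

```python
import math

def boundary_points(mask):
    """Pixels of a binary mask (list of lists of bool) that touch the
    background or the image border, i.e. the silhouette contour."""
    h, w = len(mask), len(mask[0])
    pts = []
    for i in range(h):
        for j in range(w):
            if not mask[i][j]:
                continue
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if not (0 <= ni < h and 0 <= nj < w) or not mask[ni][nj]:
                    pts.append((i, j))
                    break
    return pts

def silhouette_distance(mask_a, mask_b):
    """Symmetric mean nearest-neighbour distance between two silhouettes.
    Zero iff the contours coincide; grows as they move apart."""
    pa, pb = boundary_points(mask_a), boundary_points(mask_b)

    def one_way(src, dst):
        return sum(min(math.hypot(x - u, y - v) for (u, v) in dst)
                   for (x, y) in src) / len(src)

    return 0.5 * (one_way(pa, pb) + one_way(pb, pa))
```

In a fitting loop, `mask_a` would be a thresholded 2D projection of the current 3D SSM instance and `mask_b` the segmented radiograph; a generic optimizer then adjusts pose and shape coefficients to drive the distance down.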
Cryogenic electron tomography (cryo-ET) has gained increasing interest in recent years due to its ability to image whole cells and subcellular structures in 3D at nanometer resolution in their native environment. However, due to dose restrictions and the inability to acquire high tilt angle images, the reconstructed volumes are noisy and have missing information. Thus, features are unreliable, and precise extraction of the cell boundary is a difficult, manual, and time-intensive task. This paper presents an efficient recursive algorithm called BLASTED (Boundary Localization using Adaptive Shape and Texture Discovery) to automatically extract the cell boundary using a conditional random field (CRF) framework in which boundary points and shape are jointly inferred. The algorithm learns the texture of the boundary region progressively, and uses a global shape model and shape-dependent features to propose candidate boundary points on a slice of the membrane. It then updates the shape of that slice by accepting the appropriate candidate points using local spatial clustering, the global shape model, and trained boosted texture classifiers. The BLASTED algorithm segmented the cell membrane over an average of 93% of the length of the cell in 19 difficult cryo-ET datasets.
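One ingredient of such a pipeline, accepting candidate boundary points only when they are supported by a local spatial cluster, can be sketched generically as follows. This is an illustration of that single step, not the BLASTED implementation; the radius and neighbour-count parameters are hypothetical.

```python
import math

def filter_candidates(points, radius=1.5, min_neighbors=1):
    """Keep candidate boundary points that have at least `min_neighbors`
    other candidates within `radius`; isolated (likely spurious)
    detections are rejected."""
    kept = []
    for i, p in enumerate(points):
        support = sum(1 for j, q in enumerate(points)
                      if j != i and math.dist(p, q) <= radius)
        if support >= min_neighbors:
            kept.append(p)
    return kept
```

In the full algorithm, the surviving points would additionally be vetted against the global shape model and the trained boosted texture classifiers before the slice contour is updated.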
To explore the role of Bid protein in the mitochondria and endoplasmic reticulum (ER) associated apoptotic pathway.
Dopaminergic neurons in mammals respond to rewards and reward-predicting cues, and are thought to play an important role in learning actions or sensory cues that lead to reward. The anatomical sources of input that drive or modulate such responses are not well understood; these ultimately define the range of behavior to which dopaminergic neurons contribute. Primary rewards are not the immediate objective of all goal-directed behavior. For example, a goal of vocal learning is to imitate vocal-communication signals. Here, we demonstrate activation of dopaminergic neurons in songbirds driven by a basal ganglia region required for vocal learning, area X. Dopaminergic neurons in anesthetized zebra finches respond more strongly to the bird’s own song (BOS) than to other sounds, and area X is critical for these responses. Direct pharmacological modulation of area X output, in the absence of auditory stimulation, is sufficient to bidirectionally modulate the firing rate of dopaminergic neurons. The only known pathway from song control regions to dopaminergic neurons involves a projection from area X to the ventral pallidum (VP), which in turn projects to dopaminergic regions. We show that VP neurons are spontaneously active and inhibited preferentially by BOS, suggesting that area X disinhibits dopaminergic neurons by inhibiting VP. Supporting this model, auditory-response latencies are shorter in area X than VP, and shorter in VP than dopaminergic neurons. Thus, dopaminergic neurons can be disinhibited selectively by complex sensory stimuli via input from the basal ganglia. The functional pathway we identify may allow dopaminergic neurons to contribute to vocal learning.
Research into the neural mechanisms of place navigation in laboratory animals has led to the definition of allothetic and idiothetic navigation modes that can be examined by quantitative analysis of the generated tracks. In an attempt to use this approach in the study of human navigation behavior, 10 young subjects were examined in an enclosed arena (2.9 m in diameter, 3 m high) equipped with a computerized tracking system. Idiothetic navigation was studied in blindfolded subjects performing the following tasks: Simple Homing, Complex Homing, and Idiothesis Supported by Floor-Related Signals. Allothetic navigation was examined in sighted subjects instructed to find in an empty arena the acoustically signaled unmarked goal region and later to retrieve its position using tasks (Natural Navigation, Cue-Controlled Navigation, Snapshot Memory, Map Reading) that evaluated different aspects of allothesis. The results indicate that allothetic navigation is more accurate than idiothetic, that the poor accuracy of idiothesis is due to angular rather than distance errors, and that navigation performance is best when both allothetic and idiothetic modes contribute to the solution of the task. The proposed test battery may contribute to better understanding of the navigation disturbances accompanying various neurological disorders and to objective evaluation of the results of drug therapy and of rehabilitation procedures.
Most methods for structure-function analysis of the brain in medical images are based on voxel-wise statistical tests performed on registered magnetic resonance (MR) images across subjects. A major drawback of such methods is the inability to accurately locate regions that manifest nonlinear associations with clinical variables. In this paper, we propose Bayesian morphological analysis methods, based on a Bayesian-network representation, for the analysis of MR brain images. First, we describe how Bayesian networks (BNs) can represent probabilistic associations among voxels and clinical (function) variables. Second, we present a model-selection framework, which generates a BN that captures structure-function relationships from MR brain images and function variables. We demonstrate our methods in the context of determining associations between regional brain atrophy (as demonstrated on MR images of the brain) and functional deficits. We employ two data sets for this evaluation: the first contains MR images of 11 subjects, where associations between regional atrophy and a functional deficit are almost linear; the second data set contains MR images of the ventricles of 84 subjects, where the structure-function association is nonlinear. Our methods successfully identify voxel-wise morphological changes that are associated with functional deficits in both data sets, whereas standard statistical analysis (i.e., t-test and paired t-test) fails in the nonlinear-association case.
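The failure mode this work targets can be reproduced in a toy example: when a voxel's atrophy relates quadratically to a function variable, a linear statistic sees nothing while a dependence measure does. The sketch below uses the correlation ratio as the dependence measure; it illustrates the motivation only and is not the paper's Bayesian-network method. All names and the synthetic data are illustrative.

```python
import math
import random
import statistics

def pearson(xs, ys):
    """Linear (Pearson) correlation coefficient."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def correlation_ratio(xs, ys, bins=10):
    """Eta-squared: fraction of the variance of y explained by binned x.
    Detects any (including nonlinear) functional association."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / bins or 1.0
    groups = [[] for _ in range(bins)]
    for x, y in zip(xs, ys):
        groups[min(int((x - lo) / width), bins - 1)].append(y)
    my = statistics.fmean(ys)
    between = sum(len(g) * (statistics.fmean(g) - my) ** 2
                  for g in groups if g)
    total = sum((y - my) ** 2 for y in ys)
    return between / total

random.seed(0)
atrophy = [random.uniform(-1.0, 1.0) for _ in range(2000)]
deficit = [a * a for a in atrophy]  # purely nonlinear association
```

On this synthetic voxel, the Pearson coefficient is close to zero while the correlation ratio is close to one: exactly the kind of association a voxel-wise t-test misses but a model-selection framework over BNs can represent.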
Metastasis depends upon cancer cell growth and survival within the metastatic niche. Tumors that remodel their glycocalyces by overexpressing bulky glycoproteins such as mucins exhibit a higher predisposition to metastasize, but the role of mucins in oncogenesis remains poorly understood. Here we report that a bulky glycocalyx promotes the expansion of disseminated tumor cells in vivo by fostering integrin adhesion assembly to permit G1 cell cycle progression. We engineered tumor cells to display glycocalyces of various thicknesses by coating them with synthetic mucin-mimetic glycopolymers. Cells adorned with longer glycopolymers showed increased metastatic potential, enhanced cell cycle progression, and greater levels of integrin-FAK mechanosignaling and Akt signaling in a syngeneic mouse model of metastasis. These effects were mirrored by expression of the ectodomain of the cancer-associated mucin MUC1. These findings functionally link mucinous proteins with tumor aggression, and offer a new view of the cancer glycocalyx as a major driver of disease progression.
The ability to generate variable movements is essential for learning and adjusting complex behaviours. This variability has been linked to the temporal irregularity of neuronal activity in the central nervous system. However, how neuronal irregularity actually translates into behavioural variability is unclear. Here we combine modelling, electrophysiological and behavioural studies to address this issue. We demonstrate that a model circuit comprising topographically organized and strongly recurrent neural networks can autonomously generate irregular motor behaviours. Simultaneous recordings of neurons in singing finches reveal that neural correlations increase across the circuit driving song variability, in agreement with the model predictions. Analysing behavioural data, we find remarkable similarities in the babbling statistics of 5-6-month-old human infants and juveniles from three songbird species and show that our model naturally accounts for these 'universal' statistics.
To make successful evidence-based decisions, the brain must rapidly and accurately transform sensory inputs into specific goal-directed behaviors. Most experimental work on this subject has focused on forebrain mechanisms. Here we show that during perceptual decision-making over a period of seconds, decision-, sensory-, and error-related information converge on the lateral posterior cerebellum in crus I, a structure that communicates bidirectionally with numerous forebrain regions. We trained mice on a novel evidence-accumulation task and demonstrated that cerebellar inactivation reduces behavioral accuracy without impairing motor parameters of action. Using two-photon calcium imaging, we found that Purkinje cell somatic activity encoded choice- and evidence-related variables. Decision errors were represented by dendritic calcium spikes, which are known to drive plasticity. We propose that cerebellar circuitry may contribute to the set of distributed computations in the brain that support accurate perceptual decision-making.
A novel family of candidate gustatory receptors (GRs) was recently identified in searches of the Drosophila genome. We have performed in situ hybridization and transgene experiments that reveal expression of these genes in both gustatory and olfactory neurons in adult flies and larvae. This gene family is likely to encode both odorant and taste receptors. We have visualized the projections of chemosensory neurons in the larval brain and observe that neurons expressing different GRs project to discrete loci in the antennal lobe and subesophageal ganglion. These data provide insight into the diversity of chemosensory recognition and an initial view of the representation of gustatory information in the fly brain.