Onconase (ONC) is a member of the ribonuclease A superfamily that is toxic to cancer cells in vitro and in vivo. ONC is now in Phase IIIb clinical trials for the treatment of malignant mesothelioma. Internalization of ONC into the cytosol of cancer cells is essential for its cytotoxic activity, despite the apparent absence of a cell-surface receptor protein. Endocytosis and cytotoxicity do, however, appear to correlate with the net positive charge of ribonucleases. To dissect the contribution made by the endogenous arginine and lysine residues of ONC to its cytotoxicity, 22 variants were created in which cationic residues were replaced with alanine. Variants with the same net charge (+2 to +5) as well as equivalent catalytic activity and conformational stability were found to exhibit large (>10-fold) differences in toxicity for the cells of a human leukemia line. In addition, a more cationic ONC variant could be either much more or much less cytotoxic than a less cationic variant, again depending on the distribution of its cationic residues. The endocytosis of variants with widely divergent cytotoxic activity was quantified by flow cytometry using a small-molecule fluorogenic label and was found to vary by twofold or less. This small difference in endocytosis did not account for the large difference in cytotoxicity, implicating the distribution of cationic residues as critical for lipid-bilayer translocation subsequent to endocytosis. This finding has fundamental implications for understanding the interaction of ribonucleases and other proteins with mammalian cells.
Memory guides the choices an animal makes across widely varying conditions in dynamic environments. Consequently, the most adaptive choice depends on the options available. How can a single memory support optimal behavior across different sets of choice options? We address this question using olfactory learning in Drosophila. Even when we restrict an odor-punishment association to a single set of synapses using optogenetics, flies still show choice behavior that depends on the options they encounter. Here we show that how the odor choices are presented to the animal influences memory recall itself. Presenting two similar odors in sequence enabled flies to discriminate them not only behaviorally but also at the level of neural activity. However, when the same odors were encountered as solitary stimuli, no such differences were detectable. These results show that memory recall is not simply a comparison to a static learned template but can be adaptively modulated by stimulus dynamics.
Behavioral neuroscience faces two conflicting demands: long-duration recordings from large neural populations and unimpeded animal behavior. To meet this challenge, we developed ONIX, an open-source data acquisition system with high data throughput (2 GB/s) and low closed-loop latencies (<1 ms) that uses a 0.3-mm-thin tether to minimize behavioral impact. Head position and rotation are tracked in three dimensions and used to drive active commutation without torque measurements. ONIX can acquire data from combinations of passive electrodes, Neuropixels probes, head-mounted microscopes, cameras, three-dimensional trackers, and other data sources. We performed uninterrupted, long (~7 h) neural recordings in mice as they traversed complex three-dimensional terrain, as well as multiday sleep-tracking recordings (~55 h). ONIX enabled exploration with mobility similar to that of nonimplanted animals, in contrast to conventional tethered systems, which restrict movement. By combining long recordings with full mobility, our technology will enable progress on questions that require high-quality neural recordings during ethologically grounded behaviors.
Insect neural systems are a promising source of inspiration for new navigation algorithms, especially on low size, weight, and power platforms. Recent work in Drosophila has produced unprecedented neuroscience breakthroughs, in behavioral and neural imaging experiments as well as in the detailed mapping of neural connectivity. General mechanisms for learning orientation in the central complex (CX) of Drosophila have been investigated previously; however, it is unclear how these mechanisms extend to cases involving translation through an environment (beyond rotation alone), which is critical for navigation in robotic systems. Here, we develop a CX neural connectivity-constrained model that performs sensor fusion as well as unsupervised learning of visual features for path integration; we demonstrate the viability of this circuit for use in robotic systems in simulated and physical environments. Furthermore, we propose a theoretical account of how distributed online unsupervised modification of network weights can support learning along a trajectory through an environment by minimizing orientation-estimation error. Overall, our results may enable a new class of CX-derived low-power robotic navigation algorithms and lead to testable predictions to inform future neuroscience experiments.
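To make the path-integration idea in this abstract concrete, here is a minimal Python sketch: position is accumulated by dead reckoning along an estimated heading, and a heading estimate is refined by gradient descent on an orientation-error signal. This is an illustration of the general technique only, not the authors' CX model; the function names and the visual error signal are hypothetical.

```python
import numpy as np

def path_integrate(headings, speeds, dt=0.1):
    """Accumulate 2D position from heading (rad) and forward speed (dead reckoning)."""
    pos = np.zeros(2)
    trajectory = [pos.copy()]
    for theta, v in zip(headings, speeds):
        pos += v * dt * np.array([np.cos(theta), np.sin(theta)])
        trajectory.append(pos.copy())
    return np.array(trajectory)

def correct_heading(est, visual_obs, lr=0.1):
    """Nudge a heading estimate toward a (hypothetical) visual observation.

    The error is wrapped to (-pi, pi] so corrections take the short way around.
    """
    error = np.angle(np.exp(1j * (visual_obs - est)))
    return est + lr * error

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    headings = np.cumsum(rng.normal(0, 0.05, 200))  # random-walk heading
    speeds = np.full(200, 1.0)
    traj = path_integrate(headings, speeds)
    print("final position:", traj[-1])

    est = 0.5  # biased initial heading estimate (rad)
    for obs in headings[:20]:
        est = correct_heading(est, obs)
    print("corrected heading estimate:", est)
```

The key point the sketch captures is that any bias in the heading estimate compounds through the integral over the trajectory, which is why online error-driven correction of orientation matters for translation, not just rotation.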
A group leader decided that his lab would share the fluorescent dyes they create, for free and without authorship requirements. Nearly 12,000 aliquots later, he reveals what has happened since.
New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open-source and collaborative neuroscience.
Laboratory behavioural tasks are an essential research tool. As questions asked of behaviour and brain activity become more sophisticated, the ability to specify and run richly structured tasks becomes more important. An increasing focus on reproducibility also necessitates accurate communication of task logic to other researchers. To these ends, we developed pyControl, a system of open-source hardware and software for controlling behavioural experiments comprising a simple yet flexible Python-based syntax for specifying tasks as extended state machines, hardware modules for building behavioural setups, and a graphical user interface designed for efficiently running high-throughput experiments on many setups in parallel, all with extensive online documentation. These tools make it quicker, easier, and cheaper to implement rich behavioural tasks at scale. As important, pyControl facilitates communication and reproducibility of behavioural experiments through a highly readable task definition syntax and self-documenting features. Here, we outline the system's design and rationale, present validation experiments characterising system performance, and demonstrate example applications in freely moving and head-fixed mouse behaviour.
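To illustrate what specifying a task "as an extended state machine" means in practice, here is a minimal generic sketch in Python. This is not pyControl's actual task syntax, and the task, state, and event names are hypothetical; it shows only the pattern of states, events, and transitions that such a syntax expresses.

```python
# A toy behavioural task as a state machine: each state maps incoming
# events to a successor state. State and event names are hypothetical.
TASK = {
    "wait_for_poke": {"poke": "reward"},        # event -> next state
    "reward":        {"timeout": "intertrial"},
    "intertrial":    {"timeout": "wait_for_poke"},
}

def run_task(events, initial_state="wait_for_poke"):
    """Step through a stream of (time, event) pairs, logging each transition."""
    state = initial_state
    log = []
    for t, event in events:
        next_state = TASK[state].get(event)
        if next_state is not None:
            log.append((t, state, event, next_state))
            state = next_state
    return log

if __name__ == "__main__":
    stream = [(0.5, "poke"), (1.5, "timeout"), (4.0, "timeout"), (4.2, "poke")]
    for entry in run_task(stream):
        print("t=%.1f s: %s --%s--> %s" % entry)
```

Because the task is fully described by a declarative mapping plus an event stream, the definition doubles as documentation, which is the readability and reproducibility point the abstract makes.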
To smooth the academic-to-industry transition, one institution is experimenting with offering biomedical researchers pre-commercial open access to new optical imaging systems still under development. The approach, the authors of this case study suggest, can be a win on both sides.
Recent progress in intracellular calcium sensors and other fluorophores has promoted the widespread adoption of functional optical imaging in the life sciences. Home-built multiphoton microscopes are easy to build, highly customizable, and cost-effective. For many imaging applications a 3-axis motorized stage is critical, but commercially available motorization hardware (motorized translators, controller boxes, etc.) is often very expensive. Furthermore, the firmware on commercial motor controllers cannot easily be altered and is not usually designed with a microscope stage in mind. Here we describe an open-source motorization solution that is simple to construct, yet far cheaper and more customizable than commercial offerings. The cost of the controller and motorization hardware is under $1000. Hardware costs are kept low by replacing linear actuators with high-quality stepper motors. Electronics are assembled from commonly available hobby components, which are easy to work with. Here we describe assembly of the system and quantify the positioning accuracy of all three axes. We obtain positioning repeatability on the order of 1 μm in X/Y and 0.1 μm in Z. A hand-held control-pad allows the user to direct stage motion precisely over a wide range of speeds (10⁻¹ to 10² μm·s⁻¹), rapidly store and return to different locations, and execute "jumps" of a fixed size. In addition, the system can be controlled from a PC serial port. Our "OpenStage" controller is sufficiently flexible that it could be used to drive other devices, such as micro-manipulators, with minimal modifications.
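Since the abstract notes the controller accepts commands over a PC serial port, here is a short Python sketch of what such control could look like using pyserial. The port name, baud rate, command strings, and reply format are illustrative assumptions, not OpenStage's documented protocol.

```python
import serial  # pyserial

def move_relative(ser, axis, microns):
    """Send a hypothetical relative-move command and return the controller's reply.

    The "MR <axis> <distance>" command format and "OK" reply are assumptions
    for illustration; consult the actual OpenStage firmware for its protocol.
    """
    cmd = "MR %s %.1f\n" % (axis, microns)
    ser.write(cmd.encode("ascii"))
    return ser.readline().decode("ascii").strip()

if __name__ == "__main__":
    # Port name and baud rate are placeholders for whatever the controller uses.
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as ser:
        print(move_relative(ser, "X", 25.0))   # 25 um in X
        print(move_relative(ser, "Z", -0.5))   # 0.5 um down in Z
```

Scripted serial control like this is what makes such a stage usable from acquisition software (tiling scans, registered z-stacks) rather than only from the hand-held control-pad.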
The present study describes a task testing the ability of rats to trigger operant behavior by their spatial position relative to inaccessible rotating objects. Rats were placed in a Skinner box with a transparent front wall through which they could observe one or two adjacent objects fixed on a slowly rotating arena (d = 1 m) surrounded by an immobile black cylinder. The direction of arena rotation was alternated at a sequence of different time intervals. Rats were reinforced for the first bar-press emitted when a radius separating the two adjacent objects or dividing a single object into two halves (the pointing radius) entered a 60° sector of its circular trajectory defined with respect to the stationary Skinner box (the reward sector). Well-trained rats emitted 62.1 ± 3.6% of responses in the 60° sector preceding the reward sector and in the first 30° of the reward sector. Response rate increased only when the pointing radius was approaching the reward sector, regardless of the time elapsed since the last reward. In the extinction session, when no reward was delivered, rats responded during the whole passage of the pointing radius through the former reward sector and spontaneously decreased responding after the pointing radius left this area. This finding suggests that rats perceived the reward sector as a single continuous region. The same results were obtained when the Skinner box with the rat orbited around the immobile scene. It is concluded that rats can recognize and anticipate their position relative to movable objects.