Filter
Associated Lab
- Aguilera Castrejon Lab (16)
- Ahrens Lab (63)
- Aso Lab (40)
- Baker Lab (38)
- Betzig Lab (112)
- Beyene Lab (13)
- Bock Lab (17)
- Branson Lab (52)
- Card Lab (42)
- Cardona Lab (63)
- Chklovskii Lab (13)
- Clapham Lab (14)
- Cui Lab (19)
- Darshan Lab (12)
- Dennis Lab (1)
- Dickson Lab (46)
- Druckmann Lab (25)
- Dudman Lab (50)
- Eddy/Rivas Lab (30)
- Egnor Lab (11)
- Espinosa Medina Lab (19)
- Feliciano Lab (7)
- Fetter Lab (41)
- Fitzgerald Lab (29)
- Freeman Lab (15)
- Funke Lab (38)
- Gonen Lab (91)
- Grigorieff Lab (62)
- Harris Lab (60)
- Heberlein Lab (94)
- Hermundstad Lab (26)
- Hess Lab (77)
- Ilanges Lab (2)
- Jayaraman Lab (46)
- Ji Lab (33)
- Johnson Lab (6)
- Kainmueller Lab (19)
- Karpova Lab (14)
- Keleman Lab (13)
- Keller Lab (76)
- Koay Lab (18)
- Lavis Lab (148)
- Lee (Albert) Lab (34)
- Leonardo Lab (23)
- Li Lab (28)
- Lippincott-Schwartz Lab (168)
- Liu (Yin) Lab (6)
- Liu (Zhe) Lab (61)
- Looger Lab (138)
- Magee Lab (49)
- Menon Lab (18)
- Murphy Lab (13)
- O'Shea Lab (6)
- Otopalik Lab (13)
- Pachitariu Lab (47)
- Pastalkova Lab (18)
- Pavlopoulos Lab (19)
- Pedram Lab (15)
- Podgorski Lab (16)
- Reiser Lab (51)
- Riddiford Lab (44)
- Romani Lab (43)
- Rubin Lab (143)
- Saalfeld Lab (63)
- Satou Lab (16)
- Scheffer Lab (36)
- Schreiter Lab (67)
- Sgro Lab (21)
- Shroff Lab (31)
- Simpson Lab (23)
- Singer Lab (80)
- Spruston Lab (93)
- Stern Lab (156)
- Sternson Lab (54)
- Stringer Lab (35)
- Svoboda Lab (135)
- Tebo Lab (33)
- Tervo Lab (9)
- Tillberg Lab (21)
- Tjian Lab (64)
- Truman Lab (88)
- Turaga Lab (51)
- Turner Lab (37)
- Vale Lab (7)
- Voigts Lab (3)
- Wang (Meng) Lab (19)
- Wang (Shaohe) Lab (25)
- Wu Lab (9)
- Zlatic Lab (28)
- Zuker Lab (25)
Associated Project Team
- CellMap (12)
- COSEM (3)
- FIB-SEM Technology (2)
- Fly Descending Interneuron (11)
- Fly Functional Connectome (14)
- Fly Olympiad (5)
- FlyEM (53)
- FlyLight (49)
- GENIE (45)
- Integrative Imaging (3)
- Larval Olympiad (2)
- MouseLight (18)
- NeuroSeq (1)
- ThalamoSeq (1)
- Tool Translation Team (T3) (26)
- Transcription Imaging (49)
Publication Date
- 2025 (99)
- 2024 (220)
- 2023 (160)
- 2022 (193)
- 2021 (194)
- 2020 (196)
- 2019 (202)
- 2018 (232)
- 2017 (217)
- 2016 (209)
- 2015 (252)
- 2014 (236)
- 2013 (194)
- 2012 (190)
- 2011 (190)
- 2010 (161)
- 2009 (158)
- 2008 (140)
- 2007 (106)
- 2006 (92)
- 2005 (67)
- 2004 (57)
- 2003 (58)
- 2002 (39)
- 2001 (28)
- 2000 (29)
- 1999 (14)
- 1998 (18)
- 1997 (16)
- 1996 (10)
- 1995 (18)
- 1994 (12)
- 1993 (10)
- 1992 (6)
- 1991 (11)
- 1990 (11)
- 1989 (6)
- 1988 (1)
- 1987 (7)
- 1986 (4)
- 1985 (5)
- 1984 (2)
- 1983 (2)
- 1982 (3)
- 1981 (3)
- 1980 (1)
- 1979 (1)
- 1976 (2)
- 1973 (1)
- 1970 (1)
- 1967 (1)
4085 Publications
Showing 2531-2540 of 4085 results

Choosing a mate is one of the most consequential decisions a female will make during her lifetime. A female fly signals her willingness to mate by opening her vaginal plates, allowing a courting male to copulate. Vaginal plate opening (VPO) occurs in response to the male courtship song and is dependent on the mating status of the female. How these exteroceptive (song) and interoceptive (mating status) inputs are integrated to regulate VPO remains unknown. Here we characterize the neural circuitry that implements mating decisions in the brain of female Drosophila melanogaster. We show that VPO is controlled by a pair of female-specific descending neurons (vpoDNs). The vpoDNs receive excitatory input from auditory neurons (vpoENs), which are tuned to specific features of the D. melanogaster song, and from pC1 neurons, which encode the mating status of the female. The song responses of vpoDNs, but not vpoENs, are attenuated upon mating, accounting for the reduced receptivity of mated females. This modulation is mediated by pC1 neurons. The vpoDNs thus directly integrate the external and internal signals that control the mating decisions of Drosophila females.
Dysfunctional sociability is a core symptom in autism spectrum disorder (ASD) that may arise from neural-network dysconnectivity between multiple brain regions. However, pathogenic neural-network mechanisms underlying social dysfunction are largely unknown. Here, we demonstrate that circuit-selective mutation (ctMUT) of ASD-risk Shank3 gene within a unidirectional projection from the prefrontal cortex to the basolateral amygdala alters spine morphology and excitatory-inhibitory balance of the circuit. Shank3 ctMUT mice show reduced sociability as well as elevated neural activity and its amplitude variability, which is consistent with the neuroimaging results from human ASD patients. Moreover, the circuit hyper-activity disrupts the temporal correlation of socially tuned neurons to the events of social interactions. Finally, optogenetic circuit activation in wild-type mice partially recapitulates the reduced sociability of Shank3 ctMUT mice, while circuit inhibition in Shank3 ctMUT mice partially rescues social behavior. Collectively, these results highlight a circuit-level pathogenic mechanism of Shank3 mutation that drives social dysfunction.
When navigating in their environment, animals use visual motion cues as feedback signals that are elicited by their own motion. Such signals are provided by wide-field neurons sampling motion directions at multiple image points as the animal maneuvers. Each one of these neurons responds selectively to a specific optic flow-field representing the spatial distribution of motion vectors on the retina. Here, we describe the discovery of a group of local, inhibitory interneurons in the fruit fly Drosophila key for filtering these cues. Using anatomy, molecular characterization, activity manipulation, and physiological recordings, we demonstrate that these interneurons convey direction-selective inhibition to wide-field neurons with opposite preferred direction and provide evidence for how their connectivity enables the computation required for integrating opposing motions. Our results indicate that, rather than sharpening directional selectivity per se, these circuit elements reduce noise by eliminating non-specific responses to complex visual information.
- Discovery of bi-stratified glutamatergic lobula plate-intrinsic (LPi) interneurons
- LPi neurons provide visual null direction inhibition to wide-field tangential cells
- Blocking LPi activity leads to target neurons responding to inadequate motion cues
- Motion opponency thus increases flow-field selectivity
Newly identified inhibitory neurons are central to an integrative circuit that enables Drosophila to process visual cues with opposite motions generated during flight. The neurons are required to discriminate between distinct complex motion patterns, indicating that neural processing of opposing cues can yield outcomes beyond the simple sum of two inputs.
Mating and egg laying are tightly coordinated events in the reproductive life of all oviparous females. Oviposition is typically rare in virgin females but is initiated after copulation. Here we identify the neural circuitry that links egg laying to mating status in Drosophila melanogaster. Activation of female-specific oviposition descending neurons (oviDNs) is necessary and sufficient for egg laying, and is equally potent in virgin and mated females. After mating, sex peptide, a protein from the male seminal fluid, triggers many behavioural and physiological changes in the female, including the onset of egg laying. Sex peptide is detected by sensory neurons in the uterus and silences both these neurons and their postsynaptic ascending neurons in the abdominal ganglion. We show that these abdominal ganglion neurons directly activate the female-specific pC1 neurons. GABAergic (γ-aminobutyric-acid-releasing) oviposition inhibitory neurons (oviINs) mediate feed-forward inhibition from pC1 neurons to both oviDNs and their major excitatory input, the oviposition excitatory neurons (oviENs). By attenuating the abdominal ganglion inputs to pC1 neurons and oviINs, sex peptide disinhibits oviDNs to enable egg laying after mating. This circuitry thus coordinates the two key events in female reproduction: mating and egg laying.
How does an organism’s internal state direct its actions? At one moment an animal forages for food with acrobatic feats such as tree climbing and jumping between branches. At another time, it travels along the ground to find water or a mate, exposing itself to predators along the way. These behaviors are costly in terms of energy or physical risk, and the likelihood of performing one set of actions relative to another is strongly modulated by internal state. For example, an animal in energy deficit searches for food and a dehydrated animal looks for water. The crosstalk between physiological state and motivational processes influences behavioral intensity and intent, but the underlying neural circuits are poorly understood. Molecular genetics along with optogenetic and pharmacogenetic tools for perturbing neuron function have enabled cell type-selective dissection of circuits that mediate behavioral responses to physiological state changes. Here, we review recent progress into neural circuit analysis of hunger in the mouse by focusing on a starvation-sensitive neuron population in the hypothalamus that is sufficient to promote voracious eating. We also consider research into the motivational processes that are thought to underlie hunger in order to outline considerations for bridging the gap between homeostatic and motivational neural circuits.
Startle behaviors are rapid, high-performance motor responses to threatening stimuli. Startle responses have been identified in a broad range of species across animal diversity. For investigations of neural circuit structure and function, these behaviors offer a number of benefits, including that they are driven by large, identifiable neurons and that their neural control is simple in comparison to other behaviors. Among vertebrates, the best-known startle circuit is the Mauthner cell circuit of fishes. In recent years, genetic approaches in zebrafish have provided key tools for morphological and physiological dissection of circuits and greatly extended understanding of their architecture. Here we discuss the startle circuit of fishes, with a focus on the Mauthner cells and their associated interneurons, the spiral fiber neurons, and we add new observations on hindbrain circuit organization. We also briefly review and compare startle circuits of several other taxa, paying particular attention to how movement direction is controlled.
Escape behaviors deliver organisms away from imminent catastrophe. Here, we characterize behavioral responses of freely swimming larval zebrafish to looming visual stimuli simulating predators. We report that the visual system alone can recruit lateralized, rapid escape motor programs, similar to those elicited by mechanosensory modalities. Two-photon calcium imaging of retino-recipient midbrain regions isolated the optic tectum as an important center processing looming stimuli, with ensemble activity encoding the critical image size determining escape latency. Furthermore, we describe activity in retinal ganglion cell terminals and superficial inhibitory interneurons in the tectum during looming and propose a model for how temporal dynamics in tectal periventricular neurons might arise from computations between these two fundamental constituents. Finally, laser ablations of hindbrain circuitry confirmed that visual and mechanosensory modalities share the same premotor output network. We establish a circuit for the processing of aversive stimuli in the context of an innate visual behavior.
Active sensation requires the convergence of external stimuli with representations of body movements. We used mouse behavior, electrophysiology and optogenetics to dissect the temporal interactions among whisker movement, neural activity and sensation of touch. We photostimulated layer 4 activity in single barrels in a closed loop with whisking. Mimicking touch-related neural activity caused illusory perception of an object at a particular location, but scrambling the timing of the spikes over one whisking cycle (tens of milliseconds) did not abolish the illusion, indicating that knowledge of instantaneous whisker position is unnecessary for discriminating object locations. The illusions were induced only during bouts of directed whisking, when mice expected touch, and in the relevant barrel. Reducing activity biased behavior, consistent with a spike count code for object detection at a particular location. Our results show that mice integrate coding of touch with movement over timescales of a whisking bout to produce perception of active touch.
Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpreting functional and structural information from the nervous system, has contributed to advances in neuroimaging, brain-machine interfaces (BMI), and the design of training devices for rehabilitation. In this review, we summarize the latest breakthroughs in neuroimaging, from the microscale to the macroscale, with potential diagnostic applications for rehabilitation. We also review achievements in electrocorticography (ECoG) coding in both animal models and humans for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation of the progress of rehabilitation programs. Future rehabilitation will likely be more home-based, automated, and self-administered by patients. Further investigation and breakthroughs are mainly needed to improve the computational efficiency of neuroimaging and multichannel ECoG through selection of localized neuroinformatics, to validate the effectiveness of BMI-guided rehabilitation programs, and to simplify the operation of training devices.
Animals seek out relevant information by moving through a dynamic world, but sensory systems are usually studied under highly constrained and passive conditions that may not probe important dimensions of the neural code. Here, we explored neural coding in the barrel cortex of head-fixed mice that tracked walls with their whiskers in tactile virtual reality. Optogenetic manipulations revealed that the barrel cortex plays a role in wall tracking. Closed-loop optogenetic control of layer 4 neurons can substitute for whisker-object contact to guide behavior resembling wall tracking. We measured neural activity using two-photon calcium imaging and extracellular recordings. Neurons were tuned to the distance between the animal's snout and the contralateral wall, with monotonic, unimodal, and multimodal tuning curves. This rich representation of object location in the barrel cortex could not be predicted based on simple stimulus-response relationships involving individual whiskers and likely emerges within cortical circuits.