Zuker Lab

July 1, 2009 - February 1, 2015

Our long-term goal is to elucidate mechanisms used for signal transduction and information processing in sensory systems, and to understand how the senses create an internal representation of the outside world.

We are continuing our work on the periphery while also moving our research into the brain to investigate how information from the tongue is mapped, decoded, and transformed in the various brain taste centers.

Using a novel Drosophila place-learning assay, we are also exploring visually guided navigation to better understand how an animal knows where it is and where it is going.

The Biology of Mammalian Taste

We use the taste system as a model for our studies of chemosensation, as it provides a powerful platform for dissecting the processing of sensory information, from detection at the periphery to perception in the brain. In addition, the sense of taste is exquisitely modulated by the internal state of the organism (hunger, satiety, expectation, emotion, etc.), and thus it serves as a rich model for exploring multisensory integration.

Spatial Learning

In addition to being a Howard Hughes Medical Institute investigator and a faculty member in the Columbia University Biochemistry and Neuroscience departments, Dr. Zuker is a Senior Fellow at the Janelia Research Campus and oversees a small group that works on imaging and behavior.

Through a collaborative project with Michael Reiser, we are studying the place-learning capabilities of the fruit fly. While many insects use visual landmarks to precisely locate their nest, prey, or foraging area, the extent to which flies use vision to navigate and remember specific locations has been unclear. Using a novel Drosophila place-learning assay and the molecular and genetic tools available in the fruit fly, we are probing visually guided navigation to better understand how an animal knows where it is and where it is going.