Publications
    07/13/17 | Mapping the neural substrates of behavior.
    Robie AA, Hirokawa J, Edwards AW, Umayam LA, Lee A, Phillips ML, Card GM, Korff W, Rubin GM, Simpson JH, Reiser MB, Branson KM
    Cell. 2017 Jul 13;170(2):393-406. doi: 10.1016/j.cell.2017.06.032

    Assigning behavioral functions to neural structures has long been a central goal in neuroscience and is a necessary first step toward a circuit-level understanding of how the brain generates behavior. Here, we map the neural substrates of locomotion and social behaviors for Drosophila melanogaster using automated machine-vision and machine-learning techniques. From videos of 400,000 flies, we quantified the behavioral effects of activating 2,204 genetically targeted populations of neurons. We combined a novel quantification of anatomy with our behavioral analysis to create brain-behavior correlation maps, which are shared as browsable web pages and interactive software. Based on these maps, we generated hypotheses of regions of the brain causally related to sensory processing, locomotor control, courtship, aggression, and sleep. Our maps directly specify genetic tools to target these regions, which we used to identify a small population of neurons with a role in the control of walking.

    • We developed machine-vision methods to broadly and precisely quantify fly behavior
    • We measured effects of activating 2,204 genetically targeted neuronal populations
    • We created whole-brain maps of neural substrates of locomotor and social behaviors
    • We created resources for exploring our results and enabling further investigation

    Machine-vision analyses of large behavior and neuroanatomy data reveal whole-brain maps of regions associated with numerous complex behaviors.
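
    A minimal sketch of the map-building step described above: correlate each brain region's expression across lines with each behavior's effect across lines. The array shapes, variable names, and random placeholder data below are ours, not the paper's.

```python
# Hypothetical sketch, not the paper's code: build a brain-behavior
# correlation map by correlating each region's expression across lines
# with each behavior's effect across lines. Shapes and data are made up.
import numpy as np

rng = np.random.default_rng(0)
n_lines, n_regions, n_behaviors = 2204, 100, 14

expression = rng.random((n_lines, n_regions))        # anatomy per line
behavior = rng.normal(size=(n_lines, n_behaviors))   # behavioral effects per line

# Z-score columns; the Pearson correlation is then a scaled dot product.
ez = (expression - expression.mean(0)) / expression.std(0)
bz = (behavior - behavior.mean(0)) / behavior.std(0)
corr_map = ez.T @ bz / n_lines    # (n_regions, n_behaviors)

print(corr_map.shape)
```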

    03/06/17 | Moonwalker descending neurons mediate visually evoked retreat in Drosophila.
    Sen R, Wu M, Branson K, Robie A, Rubin GM, Dickson BJ
    Current Biology. 2017 Mar 6;27(5):766-71. doi: 10.1016/j.cub.2017.02.008

    Insects, like most animals, tend to steer away from imminent threats [1-7]. Drosophila melanogaster, for example, generally initiate an escape take-off in response to a looming visual stimulus, mimicking a potential predator [8]. The escape response to a visual threat is, however, flexible [9-12] and can alternatively consist of walking backward away from the perceived threat [11], which may be a more effective response to ambush predators such as nymphal praying mantids [7]. Flexibility in escape behavior may also add an element of unpredictability that makes it difficult for predators to anticipate or learn the prey's likely response [3-6]. Whereas the fly's escape jump has been well studied [8, 9, 13-18], the neuronal underpinnings of evasive walking remain largely unexplored. We previously reported the identification of a cluster of descending neurons, the moonwalker descending neurons (MDNs), whose activity is necessary and sufficient to trigger backward walking [19], as well as a population of visual projection neurons, the lobula columnar 16 (LC16) cells, that respond to looming visual stimuli and elicit backward walking and turning [11]. Given the similarity of their activation phenotypes, we hypothesized that LC16 neurons induce backward walking via MDNs and that turning while walking backward might reflect asymmetric activation of the left and right MDNs. Here, we present data from functional imaging, behavioral epistasis, and unilateral activation experiments that support these hypotheses. We conclude that LC16 and MDNs are critical components of the neural circuit that transduces threatening visual stimuli into directional locomotor output.

    12/14/18 | Motor cortex is an input-driven dynamical system controlling dexterous movement.
    Sauerbrei B, Guo J, Mischiati M, Guo W, Kabra M, Verma N, Branson KM, Hantman AW
    bioRxiv. 2018 Dec 14:266320. doi: 10.1101/266320

    Skillful control of movement is central to our ability to sense and manipulate the world. A large body of work in nonhuman primates has demonstrated that motor cortex provides flexible, time-varying activity patterns that control the arm during reaching and grasping. Previous studies have suggested that these patterns are generated by strong local recurrent dynamics operating autonomously from inputs during movement execution. An alternative possibility is that motor cortex requires coordination with upstream brain regions throughout the entire movement to yield these patterns. Here, we developed an experimental preparation in the mouse to directly test these possibilities using optogenetics and electrophysiology during a skilled reach-to-grab-to-eat task. To validate this preparation, we first established that a specific, time-varying pattern of motor cortical activity was required to produce coordinated movement. Next, to disentangle the contribution of local recurrent motor cortical dynamics from external input, we optogenetically held the recurrent contribution constant, then observed how motor cortical activity recovered following the end of this perturbation. Both the neural responses and hand trajectory varied from trial to trial, and this variability reflected variability in external inputs. To directly probe the role of these inputs, we used optogenetics to perturb activity in the thalamus. Thalamic perturbation at the start of the trial prevented movement initiation, and perturbation at any stage of the movement prevented progression of the hand to the target; this demonstrates that input is required throughout the movement. By comparing motor cortical activity with and without thalamic perturbation, we were able to estimate the effects of external inputs on motor cortical population activity. Thus, unlike pattern-generating circuits that are local and autonomous, such as those in the spinal cord that generate left-right alternation during locomotion, the pattern generator for reaching and grasping is distributed across multiple, strongly interacting brain regions.
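
    The two hypotheses contrasted above have a compact toy form: an autonomous linear recurrence versus the same recurrence driven by external input. The matrices and input below are illustrative placeholders, not fitted models.

```python
# Toy contrast of the two hypotheses: autonomous recurrence vs. the same
# recurrence driven by external input. A, B, and u are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n, m, T = 20, 5, 100
A = 0.95 * np.linalg.qr(rng.normal(size=(n, n)))[0]  # stable recurrent dynamics
B = rng.normal(size=(n, m))
u = rng.normal(size=(T, m))                          # external (e.g., thalamic) input

x_auto = np.zeros((T, n))
x_driven = np.zeros((T, n))
x_auto[0] = x_driven[0] = rng.normal(size=n)
for t in range(T - 1):
    x_auto[t + 1] = A @ x_auto[t]                    # local recurrence alone: decays
    x_driven[t + 1] = A @ x_driven[t] + B @ u[t]     # pattern sustained by input

# The autonomous trajectory decays toward baseline; the driven one persists,
# mirroring the idea that cortical patterns need upstream drive.
print(np.linalg.norm(x_auto[-1]), np.linalg.norm(x_driven[-1]))
```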

    03/20/24 | Motor neurons generate pose-targeted movements via proprioceptive sculpting.
    Gorko B, Siwanowicz I, Close K, Christoforou C, Hibbard KL, Kabra M, Lee A, Park J, Li SY, Chen AB, Namiki S, Chen C, Tuthill JC, Bock DD, Rouault H, Branson K, Ihrke G, Huston SJ
    Nature. 2024 Mar 20. doi: 10.1038/s41586-024-07222-5

    Motor neurons are the final common pathway through which the brain controls movement of the body, forming the basic elements from which all movement is composed. Yet how a single motor neuron contributes to control during natural movement remains unclear. Here we anatomically and functionally characterize the individual roles of the motor neurons that control head movement in the fly, Drosophila melanogaster. Counterintuitively, we find that activity in a single motor neuron rotates the head in different directions, depending on the starting posture of the head, such that the head converges towards a pose determined by the identity of the stimulated motor neuron. A feedback model predicts that this convergent behaviour results from motor neuron drive interacting with proprioceptive feedback. We identify and genetically suppress a single class of proprioceptive neuron that changes the motor neuron-induced convergence as predicted by the feedback model. These data suggest a framework for how the brain controls movements: instead of directly generating movement in a given direction by activating a fixed set of motor neurons, the brain controls movements by adding bias to a continuing proprioceptive-motor loop.
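
    The feedback model's convergence logic can be illustrated with a one-variable sketch: constant motor drive opposed by proprioceptive feedback gives a fixed point set by the drive alone, so different starting postures rotate in different directions yet converge to the same pose. The gains and units below are invented for illustration.

```python
# One-variable sketch of the feedback model (gains and units invented):
# constant motor drive u opposed by proprioceptive feedback k*theta gives
# a fixed point u/k set by the drive, independent of starting posture.
def simulate(theta0, u, k=1.0, dt=0.01, steps=2000):
    theta = theta0
    for _ in range(steps):
        theta += dt * (u - k * theta)   # drive minus proprioceptive feedback
    return theta

# Different starting postures rotate in opposite directions but converge
# to the same drive-dependent pose:
for theta0 in (-30.0, 0.0, 45.0):
    print(round(simulate(theta0, u=10.0), 2))   # all approach 10.0
```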

    03/06/11 | Multi-camera real-time three-dimensional tracking of multiple flying animals.
    Straw AD, Branson K, Neumann TR, Dickinson MH
    Journal of the Royal Society, Interface. 2011 Mar 6;8(56):395-409. doi: 10.1098/rsif.2010.0230

    Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in real time, with minimal latency, opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behaviour. Here, we describe a system capable of tracking the three-dimensional position and body orientation of animals such as flies and birds. The system operates with less than 40 ms latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the extended Kalman filter and the nearest neighbour standard filter data association algorithm. In one implementation, an 11-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behaviour of freely flying animals. If combined with other techniques, such as 'virtual reality'-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals.
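
    A heavily simplified sketch of the two ingredients named in the abstract: a Kalman-style prediction per target and nearest-neighbour association of new detections. The constant-velocity model and greedy assignment are our simplifications of the extended Kalman filter and the nearest neighbour standard filter, not the system's actual code.

```python
# Heavily simplified sketch: a constant-velocity prediction per target
# (standing in for the EKF) and greedy nearest-neighbour association.
import numpy as np

dt = 1 / 60                      # 60 frames per second
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)       # constant-velocity motion model in 3-D

def predict(x):
    return F @ x                 # predicted [position, velocity]

def associate(predictions, detections):
    pairs, used = [], set()
    for i, p in enumerate(predictions):
        d = np.linalg.norm(detections - p[:3], axis=1)
        for j in used:           # each detection feeds at most one track
            d[j] = np.inf
        j = int(np.argmin(d))
        used.add(j)
        pairs.append((i, j))
    return pairs

# Two tracked flies, two new detections:
tracks = [np.array([0, 0, 0, 1, 0, 0.0]), np.array([1, 1, 1, 0, -1, 0.0])]
dets = np.array([[0.02, 0.0, 0.0], [1.0, 0.98, 1.0]])
print(associate([predict(x) for x in tracks], dets))   # [(0, 0), (1, 1)]
```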

    08/20/18 | Multiple animals tracking in video using part affinity fields.
    Rodriguez IF, Megret R, Egnor R, Branson K, Agosto JL, Giray T, Acuna E
    Visual observation and analysis of Vertebrate And Insect Behavior 2018. 2018 Aug 20.

    In this work, we address the problem of pose detection and tracking of multiple individuals for the study of behaviour in insects and other animals. Using a deep neural network architecture, we perform precise detection and association of the body parts. The models are learned from user-annotated training videos, which gives flexibility to the approach. This is illustrated on two different animals, honeybees and mice, where very good performance in part recognition and association is observed despite the presence of multiple interacting individuals.
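
    The association step that part affinity fields enable can be sketched as follows: a candidate connection between two detected body parts is scored by sampling the field along the segment between them and averaging its agreement with the limb direction. The toy field and sampling scheme below are ours, for illustration only.

```python
# Sketch of part-affinity-field scoring (toy field, our sampling scheme):
# a candidate limb between detections p1 and p2 is scored by sampling the
# field along the segment and averaging its agreement with the direction.
import numpy as np

def paf_score(paf, p1, p2, n_samples=10):
    v = p2 - p1
    norm = np.linalg.norm(v)
    if norm == 0:
        return 0.0
    u = v / norm
    ts = np.linspace(0.0, 1.0, n_samples)
    pts = np.rint(p1 + ts[:, None] * v).astype(int)   # points along the limb
    vecs = paf[pts[:, 1], pts[:, 0]]                  # sampled (x, y) field vectors
    return float((vecs @ u).mean())

# Toy field pointing along +x everywhere: a horizontal limb scores ~1.
paf = np.zeros((64, 64, 2))
paf[..., 0] = 1.0
print(paf_score(paf, np.array([5.0, 10.0]), np.array([40.0, 10.0])))
```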

    12/23/14 | Mushroom body output neurons encode valence and guide memory-based action selection in Drosophila.
    Aso Y, Sitaraman D, Ichinose T, Kaun KR, Vogt K, Belliart-Guérin G, Placais P, Robie AA, Yamagata N, Schnaitmann C, Rowell WJ, Johnston RM, Ngo TB, Chen N, Korff W, Nitabach MN, Heberlein U, Preat T, Branson KM, Tanimoto H, Rubin GM
    eLife. 2014 Dec;4. doi: 10.7554/eLife.04580

    Animals discriminate stimuli, learn their predictive value and use this knowledge to modify their behavior. In Drosophila, the mushroom body (MB) plays a key role in these processes. Sensory stimuli are sparsely represented by ∼2000 Kenyon cells, which converge onto 34 output neurons (MBONs) of 21 types. We studied the role of MBONs in several associative learning tasks and in sleep regulation, revealing the extent to which information flow is segregated into distinct channels and suggesting possible roles for the multi-layered MBON network. We also show that optogenetic activation of MBONs can, depending on cell type, induce repulsion or attraction in flies. The behavioral effects of MBON perturbation are combinatorial, suggesting that the MBON ensemble collectively represents valence. We propose that local, stimulus-specific dopaminergic modulation selectively alters the balance within the MBON network for those stimuli. Our results suggest that valence encoded by the MBON ensemble biases memory-based action selection.

    11/02/17 | Network-size independent covering number bounds for deep networks.
    Kabra M, Branson KM
    arXiv. 2017 Nov 2: arXiv:1711.00753

    We give a covering number bound for deep learning networks that is independent of the size of the network. The key to the simple analysis is that, for linear classifiers, rotating the data doesn't affect the covering number. Thus, we can ignore the rotation part of each layer's linear transformation and obtain the covering number bound by concentrating on the scaling part.
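
    The rotation-invariance observation can be restated in a few lines; the notation here is ours, not the paper's.

```latex
% Our notation, not the paper's. Rotations preserve inner products, so
% rotating both the weight vector and the data leaves a linear
% classifier's outputs, and hence the covering number, unchanged:
\[
  \langle Rw,\, Rx \rangle = \langle w,\, x \rangle
  \qquad \text{for any orthogonal } R .
\]
% Writing each layer's weight matrix via its polar decomposition,
\[
  W = RS, \qquad R^\top R = I, \quad S \succeq 0,
\]
% a cover therefore only needs to track the scaling part $S$, so the
% resulting bound depends on the layers' scales rather than on width.
```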

    12/07/15 | Sample complexity of learning Mahalanobis distance metrics.
    Verma N, Branson KM
    Neural Information Processing Systems Conference. 2015 Dec;28.

    Metric learning seeks a transformation of the feature space that enhances prediction quality for a given task. In this work we provide PAC-style sample complexity rates for supervised metric learning. We give matching lower- and upper-bounds showing that sample complexity scales with the representation dimension when no assumptions are made about the underlying data distribution. In addition, by leveraging the structure of the data distribution, we provide rates fine-tuned to a specific notion of the intrinsic complexity of a given dataset, allowing us to relax the dependence on representation dimension. We show both theoretically and empirically that augmenting the metric learning optimization criterion with a simple norm-based regularization is important and can help adapt to a dataset’s intrinsic complexity yielding better generalization, thus partly explaining the empirical success of similar regularizations reported in previous works.
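
    For reference, the object being learned is a Mahalanobis distance parameterized by a positive semidefinite matrix M; the hinge loss and Frobenius-norm penalty below are generic placeholders standing in for the paper's criterion, not its exact formulation.

```python
# The learned object: a Mahalanobis distance parameterized by a PSD
# matrix M. The hinge loss and Frobenius-norm penalty are generic
# placeholders, not the paper's exact criterion.
import numpy as np

def mahalanobis(x, y, M):
    d = x - y
    return float(np.sqrt(d @ M @ d))

def regularized_loss(pairs, labels, M, lam=0.1):
    # labels: +1 for similar pairs, -1 for dissimilar
    margins = [l * (1.0 - mahalanobis(x, y, M)) for (x, y), l in zip(pairs, labels)]
    hinge = sum(max(0.0, 1.0 - m) for m in margins) / len(pairs)
    return hinge + lam * np.linalg.norm(M, "fro")   # norm-based regularization

# With M = I the metric reduces to Euclidean distance:
print(mahalanobis(np.array([1.0, 0.0]), np.array([0.0, 0.0]), np.eye(2)))   # 1.0
```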

    06/22/23 | Small-field visual projection neurons detect translational optic flow and support walking control.
    Isaacson MD, Eliason JLM, Nern A, Rogers EM, Lott GK, Tabachnik T, Rowell WJ, Edwards AW, Korff WL, Rubin GM, Branson K, Reiser MB
    bioRxiv. 2023 Jun 22. doi: 10.1101/2023.06.21.546024

    Animals rely on visual motion for navigating the world, and research in flies has clarified how neural circuits extract information from moving visual scenes. However, the major pathways connecting these patterns of optic flow to behavior remain poorly understood. Using a high-throughput quantitative assay of visually guided behaviors and genetic neuronal silencing, we discovered a region in Drosophila’s protocerebrum critical for visual motion following. We used neuronal silencing, calcium imaging, and optogenetics to identify a single cell type, LPC1, that innervates this region, detects translational optic flow, and plays a key role in regulating forward walking. Moreover, the population of LPC1s can estimate the travelling direction, even when gaze direction diverges from body heading. By linking specific cell types and their visual computations to specific behaviors, our findings establish a foundation for understanding how the nervous system uses vision to guide navigation.
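
    The geometric core of such a readout, recovering the travel direction when it diverges from gaze, is classical: under pure translation, flow vectors radiate from a focus of expansion that marks the direction of travel. The toy generative model and least-squares readout below are ours, for intuition only.

```python
# Toy geometric core of the readout (our generative model, for intuition):
# under pure translation, flow vectors radiate from a focus of expansion
# (FoE) that marks the travel direction, so the FoE is recoverable.
import numpy as np

rng = np.random.default_rng(2)
pts = rng.uniform(-1, 1, size=(200, 2))   # image points
foe = np.array([0.3, -0.1])               # true focus of expansion
flow = pts - foe                          # radial flow away from the FoE

# Each flow vector constrains the FoE: perp(flow) . (p - foe) = 0.
perp = np.stack([-flow[:, 1], flow[:, 0]], axis=1)
b = np.einsum("ij,ij->i", perp, pts)
foe_hat, *_ = np.linalg.lstsq(perp, b, rcond=None)
print(foe_hat)                            # ~ [0.3, -0.1]
```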
