Hermundstad Lab / Publications

5 Publications

06/12/17 | Neural signatures of dynamic stimulus selection in Drosophila.
Sun Y, Nern A, Franconville R, Dana H, Schreiter ER, Looger LL, Svoboda K, Kim DS, Hermundstad AM, Jayaraman V
Nature Neuroscience. 2017 Jun 12;20(8):1104-13. doi: 10.1038/nn.4581

Many animals orient using visual cues, but how a single cue is selected from among many is poorly understood. Here we show that Drosophila ring neurons—central brain neurons implicated in navigation—display visual stimulus selection. Using in vivo two-color two-photon imaging with genetically encoded calcium indicators, we demonstrate that individual ring neurons inherit simple-cell-like receptive fields from their upstream partners. Stimuli in the contralateral visual field suppressed responses to ipsilateral stimuli in both populations. Suppression strength depended on when and where the contralateral stimulus was presented, an effect stronger in ring neurons than in their upstream inputs. This history-dependent effect on the temporal structure of visual responses, which was well modeled by a simple biphasic filter, may determine how visual references are selected for the fly's internal compass. Our approach highlights how two-color calcium imaging can help identify and localize the origins of sensory transformations across synaptically connected neural populations.

11/14/14 | Variance predicts salience in central sensory processing.
Hermundstad AM, Briguglio JJ, Conte MM, Victor JD, Balasubramanian V, Tkačik G
eLife. 2014;3:e03722. doi: 10.7554/eLife.03722

Information processing in the sensory periphery is shaped by natural stimulus statistics. In the periphery, a transmission bottleneck constrains performance; thus efficient coding implies that natural signal components with a predictably wider range should be compressed. In a different regime, when sampling limitations constrain performance, efficient coding implies that more resources should be allocated to informative features that are more variable. We propose that this regime is relevant for sensory cortex when it extracts complex features from limited numbers of sensory samples. To test this prediction, we use central visual processing as a model: we show that visual sensitivity for local multi-point spatial correlations, described by dozens of independently-measured parameters, can be quantitatively predicted from the structure of natural images. This suggests that efficient coding applies centrally, where it extends to higher-order sensory features and operates in a regime in which sensitivity increases with feature variability.

05/15/14 | Structurally-constrained relationships between cognitive states in the human brain.
Hermundstad AM, Brown KS, Bassett DS, Aminoff EM, Frithsen A, Johnson A, Tipper CM, Miller MB, Grafton ST, Carlson JM
PLoS Computational Biology. 2014 May;10(5):e1003591. doi: 10.1371/journal.pcbi.1003591

The anatomical connectivity of the human brain supports diverse patterns of correlated neural activity that are thought to underlie cognitive function. In a manner sensitive to underlying structural brain architecture, we examine the extent to which such patterns of correlated activity systematically vary across cognitive states. Anatomical white matter connectivity is compared with functional correlations in neural activity measured via blood oxygen level dependent (BOLD) signals. Functional connectivity is separately measured at rest, during an attention task, and during a memory task. We assess these structural and functional measures within previously-identified resting-state functional networks, denoted task-positive and task-negative networks, that have been independently shown to be strongly anticorrelated at rest but also involve regions of the brain that routinely increase and decrease in activity during task-driven processes. We find that the density of anatomical connections within and between task-positive and task-negative networks is differentially related to strong, task-dependent correlations in neural activity. The space mapped out by the observed structure-function relationships is used to define a quantitative measure of separation between resting, attention, and memory states. We find that the degree of separation between states is related to both general measures of behavioral performance and relative differences in task-specific measures of attention versus memory performance. These findings suggest that the observed separation between cognitive states reflects underlying organizational principles of human brain structure and function.

04/09/13 | Structural foundations of resting-state and task-based functional connectivity in the human brain.
Hermundstad AM, Bassett DS, Brown KS, Aminoff EM, Clewett D, Freeman S, Frithsen A, Johnson A, Tipper CM, Miller MB, Grafton ST, Carlson JM
Proceedings of the National Academy of Sciences of the United States of America. 2013 Apr 9;110(15):6169-74. doi: 10.1073/pnas.1219562110

Magnetic resonance imaging enables the noninvasive mapping of both anatomical white matter connectivity and dynamic patterns of neural activity in the human brain. We examine the relationship between the structural properties of white matter streamlines (structural connectivity) and the functional properties of correlations in neural activity (functional connectivity) within 84 healthy human subjects both at rest and during the performance of attention- and memory-demanding tasks. We show that structural properties, including the length, number, and spatial location of white matter streamlines, are indicative of and can be inferred from the strength of resting-state and task-based functional correlations between brain regions. These results, which are both representative of the entire set of subjects and consistently observed within individual subjects, uncover robust links between structural and functional connectivity in the human brain.

06/30/11 | Learning, memory, and the role of neural network architecture.
Hermundstad AM, Brown KS, Bassett DS, Carlson JM
PLoS Computational Biology. 2011 Jun;7(6):e1002063. doi: 10.1371/journal.pcbi.1002063

The performance of information processing systems, from artificial neural networks to natural neuronal ensembles, depends heavily on the underlying system architecture. In this study, we compare the performance of parallel and layered network architectures during sequential tasks that require both acquisition and retention of information, thereby identifying tradeoffs between learning and memory processes. During the task of supervised, sequential function approximation, networks produce and adapt representations of external information. Performance is evaluated by statistically analyzing the error in these representations while varying the initial network state, the structure of the external information, and the time given to learn the information. We link performance to complexity in network architecture by characterizing local error landscape curvature. We find that variations in error landscape structure give rise to tradeoffs in performance; these include the ability of the network to maximize accuracy versus minimize inaccuracy and produce specific versus generalizable representations of information. Parallel networks generate smooth error landscapes with deep, narrow minima, enabling them to find highly specific representations given sufficient time. While accurate, however, these representations are difficult to generalize. In contrast, layered networks generate rough error landscapes with a variety of local minima, allowing them to quickly find coarse representations. Although less accurate, these representations are easily adaptable. The presence of measurable performance tradeoffs in both layered and parallel networks has implications for understanding the behavior of a wide variety of natural and artificial learning systems.
