We describe a class of models that predict how the instantaneous firing rate of a neuron depends on a dynamic stimulus. The models utilize a learnt pointwise nonlinear transform of the stimulus, followed by a linear filter that acts on the sequence of transformed inputs. In one case, the nonlinear transform is the same at all filter lag-times; this "input nonlinearity" thus converts the initial numerical representation of stimulus value into a new representation that provides optimal input to the subsequent linear model. We describe algorithms that estimate the input nonlinearity and the linear weights simultaneously, and present techniques to regularise and quantify uncertainty in the estimates. In a second approach, the model is generalized to allow a different nonlinear transform of the stimulus value at each lag-time. Although more general, this model is algorithmically more straightforward to fit; however, it has many more degrees of freedom than the first approach and therefore requires more data for accurate estimation. We test the feasibility of these methods on synthetic data and on responses from a neuron in rodent barrel cortex. The models are shown to predict responses to novel data accurately and to recover several important neuronal response properties.
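The simultaneous estimation of the input nonlinearity and the linear weights can be sketched as an alternating least-squares procedure. The following is a minimal illustration under assumed simplifications, not the authors' implementation: the nonlinearity is represented as a lookup table over stimulus quantile bins, and each sub-step is an ordinary NumPy least-squares solve.

```python
import numpy as np

def fit_nl_l(stim, rate, n_lags=5, n_bins=8, n_iter=20):
    """Alternating least squares for the cascade model
    rate(t) ~ sum_k w[k] * g[bin(stim[t-k])],
    where g (input nonlinearity, one value per stimulus bin) and
    w (linear filter) are each updated by a linear regression."""
    edges = np.quantile(stim, np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(stim, edges)                 # stimulus bin per time step
    # lagged bin indices: L[t, k] = bin of stim[t - k]
    L = np.stack([np.roll(bins, k) for k in range(n_lags)], axis=1)[n_lags:]
    y = np.asarray(rate)[n_lags:]
    w = np.full(n_lags, 1.0 / n_lags)               # initial filter guess
    g = np.linspace(-1.0, 1.0, n_bins)              # initial nonlinearity guess
    for _ in range(n_iter):
        # fix w, solve for g: M[t, b] = sum of w[k] over lags k with L[t, k] == b
        M = np.zeros((len(y), n_bins))
        for k in range(n_lags):
            np.add.at(M, (np.arange(len(y)), L[:, k]), w[k])
        g, *_ = np.linalg.lstsq(M, y, rcond=None)
        g /= np.linalg.norm(g)                      # fix the scale ambiguity in (w, g)
        # fix g, solve for w: design matrix X[t, k] = g[L[t, k]]
        w, *_ = np.linalg.lstsq(g[L], y, rcond=None)
    return w, g, edges
```

Each sub-problem is an ordinary linear regression, so the bilinear model is fit without nonlinear optimization; regularisation (e.g. ridge penalties on `w` and `g`) would slot into each `lstsq` step.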
Biophysically accurate multicompartmental models of individual neurons have significantly advanced our understanding of the input-output function of single cells. These models depend on a large number of parameters that are difficult to estimate. In practice, they are often hand-tuned to match measured physiological behaviors, thus raising questions of identifiability and interpretability. We propose a statistical approach to the automatic estimation of various biologically relevant parameters, including 1) the distribution of channel densities, 2) the spatiotemporal pattern of synaptic input, and 3) axial resistances across extended dendrites. Recent experimental advances, notably in voltage-sensitive imaging, motivate us to assume access to: i) the spatiotemporal voltage signal in the dendrite and ii) an approximate description of the channel kinetics of interest. We show here that, given i and ii, parameters 1-3 can be inferred simultaneously by nonnegative linear regression; that this optimization problem possesses a unique solution and is guaranteed to converge despite the large number of parameters and their complex nonlinear interaction; and that standard optimization algorithms efficiently reach this optimum with modest computational and data requirements. We demonstrate that the method leads to accurate estimates on a wide variety of challenging model data sets that include up to about 10⁴ parameters (roughly two orders of magnitude more than previously feasible) and describe how the method gives insights into the functional interaction of groups of channels.
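The reason nonnegative linear regression suffices is that, once the voltage signal and the channel kinetics are known, each channel's per-unit-density current time course is a known regressor, and the unknown densities enter the voltage-derivative equation linearly with a physical nonnegativity constraint. A minimal synthetic sketch (white-noise traces stand in for the real per-channel currents, and `scipy.optimize.nnls` plays the role of the constrained solver; nothing here reproduces the paper's actual pipeline):

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
T, n_chan = 2000, 6

# Per-unit-density current for each channel type at each time step,
# assumed computable from the observed voltage and the known channel
# kinetics; white-noise traces stand in for real gating dynamics here.
A = rng.normal(size=(T, n_chan))

g_true = np.array([0.0, 1.2, 0.0, 0.5, 2.0, 0.3])   # densities; some channels absent
dVdt = A @ g_true + 0.01 * rng.normal(size=T)       # "measured" voltage derivative

# Channel densities are physically nonnegative, so estimation becomes a
# nonnegative least-squares problem: convex, with a unique global optimum.
g_hat, _ = nnls(A, dVdt)
```

Because the problem is convex, the solver cannot get trapped in local minima, and absent channels are recovered as densities clamped at zero rather than as small negative artifacts.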
Our understanding of the input-output function of single cells has been substantially advanced by biophysically accurate multi-compartmental models. The large number of parameters needing hand tuning in these models has, however, somewhat hampered their applicability and interpretability. Here we propose a simple and well-founded method for automatic estimation of many of these key parameters: 1) the spatial distribution of channel densities on the cell’s membrane; 2) the spatiotemporal pattern of synaptic input; 3) the channels’ reversal potentials; 4) the intercompartmental conductances; and 5) the noise level in each compartment. We assume experimental access to: a) the spatiotemporal voltage signal in the dendrite (or some contiguous subpart thereof, e.g. via voltage sensitive imaging techniques), b) an approximate kinetic description of the channels and synapses present in each compartment, and c) the morphology of the part of the neuron under investigation. The key observation is that, given data a)-c), all of the parameters 1)-4) may be simultaneously inferred by a version of constrained linear regression; this regression, in turn, is efficiently solved using standard algorithms, without any “local minima” problems despite the large number of parameters and complex dynamics. The noise level 5) may also be estimated by standard techniques. We demonstrate the method’s accuracy on several model datasets, and describe techniques for quantifying the uncertainty in our estimates.
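Under the assumptions a)-c) above, the constrained-regression view extends naturally to intercompartmental conductances, and the noise level falls out of the regression residual. The following is a hypothetical two-compartment sketch with synthetic voltages and gating variables, using `scipy.optimize.lsq_linear` as a stand-in bounded solver rather than the authors' code:

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(2)
T = 3000
V1 = rng.normal(-60.0, 8.0, size=T)     # voltage in the compartment of interest (mV)
V2 = rng.normal(-55.0, 8.0, size=T)     # voltage in the neighbouring compartment (mV)
m = rng.uniform(0.0, 1.0, size=(T, 2))  # known gating variables for two channel types
E = np.array([50.0, -90.0])             # known reversal potentials (mV)

# Regressors: intercompartmental current plus per-unit-conductance channel currents.
X = np.column_stack([V2 - V1, m * (E - V1[:, None])])
theta_true = np.array([0.8, 0.05, 0.2])  # [g_axial, g_channel1, g_channel2]
sigma_true = 0.5
dVdt = X @ theta_true + sigma_true * rng.normal(size=T)

# Conductances are nonnegative -> bounded least squares: convex, no local minima.
sol = lsq_linear(X, dVdt, bounds=(0.0, np.inf))
theta_hat = sol.x
sigma_hat = np.std(dVdt - X @ theta_hat)  # noise level estimated from the residual
```

The same design-matrix construction scales to many compartments and channel types: every unknown conductance contributes one column, so the whole estimate remains a single constrained regression.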