Abstract
Theoretical neuroscientists often try to understand how the structure of a neural network relates to its function by focusing on structural features that would either follow from optimization or occur consistently across possible implementations. Both optimization theories and ensemble modeling approaches have repeatedly proven their worth, and it would simplify theory building considerably if predictions from both theory types could be derived and tested simultaneously. Here we show how tensor formalism from theoretical physics can be used to unify and solve many optimization and ensemble modeling approaches to predicting synaptic connectivity from neuronal responses. We specifically focus on analyzing the solution space of synaptic weights that allow a threshold-linear neural network to respond in a prescribed way to a limited number of input conditions. For optimization purposes, we compute the synaptic weight vector that minimizes an arbitrary quadratic loss function. For ensemble modeling, we identify synaptic weight features that occur consistently across all solutions bounded by an arbitrary quadratic function. We derive a common solution to this suite of nonlinear problems by showing how each of them reduces to an equivalent linear problem that can be solved analytically. Although identifying the equivalent linear problem is nontrivial, our tensor formalism provides an elegant geometrical perspective that allows us to solve the problem numerically. The final algorithm is applicable to a wide range of interesting neuroscience problems, and the associated geometric insights may carry over to other scientific problems that require constrained optimization.
arXiv Preprint https://doi.org/10.48550/arXiv.2310.20309
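
As a rough illustration of the kind of reduction the abstract describes (not the paper's actual algorithm), the sketch below minimizes a quadratic loss over the weights of a single threshold-linear unit, assuming we already know which input conditions evoke positive prescribed responses. Under that assumption, those conditions become linear equality constraints and the problem reduces to a linear KKT system that can be solved analytically, in the spirit of the "equivalent linear problem" mentioned above. The function name `min_quadratic_loss`, the identity loss matrix, and the toy inputs are all illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's algorithm): minimize a
# quadratic loss (1/2) w^T Q w over synaptic weights w of one threshold-linear
# unit, r = max(0, w . x), subject to prescribed positive responses on a few
# input conditions. With the active conditions fixed, the constraints are
# linear (X_active @ w = r_active) and the optimum solves a linear KKT system.

import numpy as np

def min_quadratic_loss(Q, X_active, r_active):
    """Solve min_w (1/2) w^T Q w  s.t.  X_active @ w = r_active."""
    n = Q.shape[0]
    m = X_active.shape[0]
    # KKT system: [[Q, X^T], [X, 0]] [w; lambda] = [0; r]
    kkt = np.block([[Q, X_active.T],
                    [X_active, np.zeros((m, m))]])
    rhs = np.concatenate([np.zeros(n), r_active])
    sol = np.linalg.solve(kkt, rhs)
    return sol[:n]

# Toy example: 3 synapses, 2 input conditions with prescribed responses 1.0 and 0.5.
rng = np.random.default_rng(0)
Q = np.eye(3)                       # simple quadratic loss: (1/2) ||w||^2
X = rng.standard_normal((2, 3))     # input conditions (one per row)
r = np.array([1.0, 0.5])            # prescribed positive responses
w = min_quadratic_loss(Q, X, r)
print(w, X @ w)                     # X @ w reproduces r, so max(0, X @ w) = r
```

In the full problem, conditions prescribed to have zero response contribute inequality constraints, and determining which of them bind is the nontrivial step of identifying the equivalent linear problem that the abstract alludes to.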