Noisy solo neurons show consistency in groups

April 14, 2021


Measurements of neural populations reveal that mouse neurons can discriminate between subtle differences in visual input—even if the mice don’t appear to notice.


Individual neurons are unreliable — but they gain precision by working together.

Ping the same brain cell multiple times, and it’ll respond differently each time. A neuron that fires in response to seeing a cow might not fire again when that cow reappears five minutes later. And yet, despite this variation, neurons in the visual cortex manage to create meaningful representations of every object an animal might encounter during its life.

Neurons’ strength comes in numbers: Pooled across large populations, groups of neurons can represent objects with remarkable consistency, new research shows. Sets of neurons in the mouse visual cortex can discriminate between minuscule differences in the rotation of an object, researchers from the Howard Hughes Medical Institute’s Janelia Research Campus report April 14 in the journal Cell. Those distinctions are a hundred times smaller than what mice appear to be able to discriminate in behavioral experiments.

The discrepancy suggests that apparent deficits in visual acuity are due to processing limitations in other parts of the brain, not the data that the brain is collecting about the world, says Janelia group leader Carsen Stringer, who co-led the project with group leader Marius Pachitariu. “Mice aren’t information-limited—as much information as possible about the visual world is getting into the visual cortex,” Stringer says.

Pachitariu and Stringer used a specialized microscope to simultaneously record the activity of up to 50,000 neurons in the mouse visual cortex. They showed mice black-and-white striped images randomly rotated to different angles, and tracked how the neurons reacted.

As predicted, the responses of individual neurons varied wildly between trials. But using a series of computational analyses, Stringer and Pachitariu found that the neurons pooled into consistent groups with much more even-keeled responses. These averaged-out “super neurons” responded consistently to the same image shown multiple times, and could discriminate between image rotations as small as 0.3 degrees.
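The pooling idea can be sketched with a toy simulation (all numbers below are illustrative, not taken from the study): thousands of noisy, weakly tuned model neurons, read out together through a simple linear average weighted by each neuron's tuning, can separate an angle difference far too small for any single neuron to resolve.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 10_000   # hypothetical population size
n_trials = 100       # repeats of each stimulus
d_theta = 0.3        # rotation difference to discriminate, in degrees

# Toy tuning: each neuron's mean response changes linearly with angle,
# with a random slope; single-trial noise swamps any one neuron's signal.
slopes = rng.normal(0.0, 1.0, n_neurons)   # response change per degree
noise_sd = 30.0                            # trial-to-trial noise per neuron

def population_response(theta_deg):
    """One trial of noisy responses from the whole population."""
    return slopes * theta_deg + rng.normal(0.0, noise_sd, n_neurons)

def decode(resp):
    """Linear readout: project responses onto the tuning slopes.
    This weighted average plays the role of a pooled 'super neuron'."""
    return resp @ slopes / (slopes @ slopes)

est_a = np.array([decode(population_response(0.0)) for _ in range(n_trials)])
est_b = np.array([decode(population_response(d_theta)) for _ in range(n_trials)])

# d-prime: separation of the two conditions in units of trial-to-trial noise
dprime = (est_b.mean() - est_a.mean()) / np.sqrt(0.5 * (est_a.var() + est_b.var()))

# A lone neuron at the same noise level can barely tell the angles apart
single_dprime = d_theta * abs(slopes[0]) / noise_sd

print(f"population d' = {dprime:.2f}, single neuron d' = {single_dprime:.4f}")
```

Because the independent noise averages away while the tuning signal adds up, the pooled readout's precision grows roughly with the square root of the population size, which is why a population this large can resolve a fraction of a degree.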

It was a somewhat surprising finding, Pachitariu says: In behavioral tests, mice don’t do nearly as well as the performance of these super neurons suggests they could. In other words, the visual system is detecting very fine visual differences, but that message is getting muddled somewhere else in the brain.

“We tend to think that if your neurons know what the correct stimulus is, then you should know too,” Pachitariu says. “But that's just simply not true!”

The data were first shared publicly in 2019, and were repurposed for Neuromatch Academy, an intensive computational neuroscience summer school that Stringer helped to develop.

The findings run counter to several other recent studies, which found a closer alignment between the detail that visual neurons can perceive and how mice actually behave. Pachitariu suspects the contradiction might come down to methodological differences — these other studies recorded from smaller populations of neurons. “If you record from more neurons and use more stimuli, you can measure finer differences,” Pachitariu says.

It’s still not clear what the value of all this neural noise is for mice. Neurons are some of the most energy-intensive cells in the body, so meaningless neural chatter can be costly. “There’s always the possibility that if we had the right stimulus to drive every neuron, it might fire more reliably,” says Pachitariu, but that hasn’t yet been shown. “It leaves us with a mystery that I think is worth pursuing further.”



Carsen Stringer, Michalis Michaelos, Dmitri Tsyboulski, Sarah E. Lindo, and Marius Pachitariu. "High-precision coding in visual cortex," Cell, published online April 14, 2021. doi:10.1016/j.cell.2021.03.042.