Paper Detail

Paper: PS-1B.76
Session: Poster Session 1B
Location: H Fläche 1.OG
Session Time: Saturday, September 14, 16:30 - 19:30
Presentation Time: Saturday, September 14, 16:30 - 19:30
Presentation: Poster
Publication: 2019 Conference on Cognitive Computational Neuroscience, 13-16 September 2019, Berlin, Germany
Paper Title: Noise correlations facilitate faster learning
License: This work is licensed under a Creative Commons Attribution 3.0 Unported License.
DOI: https://doi.org/10.32470/CCN.2019.1379-0
Authors: Matthew Nassar, Brown University, United States
Abstract: One major challenge for AI is that, while deep neural networks are capable of achieving human-level performance on a wide variety of tasks, they typically require a greater number of learning trials than a human would. This issue has stimulated interest in the inductive biases that humans and other animals employ to constrain learning in complex natural environments. While the neural mechanisms used to implement inductive biases could be informative both for improving AI and for providing a better mechanistic understanding of learning, these neural underpinnings remain elusive. Here I explore the possibility that stimulus-independent pairwise correlations between neurons, or so-called noise correlations, might reflect inductive biases used to constrain learning to specific task-relevant dimensions. I test this idea with a neural network model of a two-alternative forced-choice perceptual discrimination task in which the correlation among similarly tuned units can be manipulated independently of the overall population signal-to-noise ratio. Higher noise correlations among similarly tuned units led to faster learning through weight adjustments that favored homogeneous weights assigned to neurons within a functionally similar pool. Such noise correlations emerge naturally with Hebbian learning. These results suggest that noise correlations may serve to reduce the dimensionality of learning, thereby making it more rapid and robust.
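The core manipulation described in the abstract, varying within-pool noise correlations while holding per-neuron variance (and hence population signal-to-noise ratio) fixed, can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the author's actual model: all parameter values (pool size, noise scale, learning rate) are hypothetical, and a simple delta-rule readout stands in for whatever learning rule the paper uses. Correlated noise is injected through a shared noise source per pool, so that rho controls pairwise correlation without changing single-neuron variance.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50        # neurons per pool (two pools, preferring stimulus +1 or -1); hypothetical
SIGMA = 1.0   # per-neuron noise std, held fixed across correlation conditions

def simulate_learning(rho, n_trials=500, lr=0.002):
    """Train a linear readout on a 2AFC discrimination with a delta rule.

    rho sets the within-pool noise correlation via a shared noise source;
    per-neuron variance stays SIGMA**2 regardless of rho, so the
    single-neuron signal-to-noise ratio is unchanged by the manipulation.
    Returns the fraction of correct choices over the training run.
    """
    tuning = np.concatenate([np.ones(N), -np.ones(N)])  # pool preferences
    w = np.zeros(2 * N)                                 # readout weights
    correct = np.zeros(n_trials, dtype=bool)
    for t in range(n_trials):
        s = rng.choice([-1.0, 1.0])                     # stimulus category
        shared = rng.standard_normal(2)                 # one shared source per pool
        private = rng.standard_normal(2 * N)            # independent per-neuron noise
        noise = SIGMA * (np.sqrt(rho) * np.repeat(shared, N)
                         + np.sqrt(1.0 - rho) * private)
        r = tuning * s + noise                          # population response
        y = w @ r                                       # readout
        choice = 1.0 if y >= 0 else -1.0
        correct[t] = (choice == s)
        w += lr * (s - y) * r                           # delta-rule weight update
    return correct.mean()

acc_uncorrelated = simulate_learning(rho=0.0)
acc_correlated = simulate_learning(rho=0.6)
```

Comparing learning curves (or accuracy over a fixed trial budget) across rho values is one way to probe the abstract's claim that correlated noise among similarly tuned units shapes the speed of learning; the paper's specific result, faster learning via more homogeneous within-pool weights, depends on its particular network and learning rule.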