Deterministic networks for probabilistic computing

Bibliographic Details
Main Authors: Jordan, Jakob (Author), Petrovici, Mihai A. (Author), Breitwieser, Oliver (Author), Schemmel, Johannes (Author), Meier, Karlheinz (Author), Diesmann, Markus (Author), Tetzlaff, Tom (Author)
Format: Article (Journal)
Language: English
Published: 4 December 2019
In: Scientific Reports
Year: 2019, Volume: 9
ISSN: 2045-2322
DOI: 10.1038/s41598-019-54137-7
Online Access: Publisher, full text: https://doi.org/10.1038/s41598-019-54137-7
Publisher, full text: https://www.nature.com/articles/s41598-019-54137-7
Description
Summary: Neuronal network models of high-level brain functions such as memory recall and reasoning often rely on the presence of some form of noise. The majority of these models assumes that each neuron in the functional network is equipped with its own private source of randomness, often in the form of uncorrelated external noise. In vivo, synaptic background input has been suggested to serve as the main source of noise in biological neuronal networks. However, the finiteness of the number of such noise sources constitutes a challenge to this idea. Here, we show that shared-noise correlations resulting from a finite number of independent noise sources can substantially impair the performance of stochastic network models. We demonstrate that this problem is naturally overcome by replacing the ensemble of independent noise sources by a deterministic recurrent neuronal network. By virtue of inhibitory feedback, such networks can generate small residual spatial correlations in their activity which, counter to intuition, suppress the detrimental effect of shared input. We exploit this mechanism to show that a single recurrent network of a few hundred neurons can serve as a natural noise source for a large ensemble of functional networks performing probabilistic computations, each comprising thousands of units.
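
Note: The summary's first quantitative point, that drawing "private" noise from a finite pool of shared sources induces input correlations, can be illustrated numerically. The NumPy sketch below is not the authors' code; the pool size M, fan-in K, number of units N, and Gaussian sources are assumptions made purely for illustration. With random source assignment, two units share on average K*K/M sources, so their summed noise inputs are correlated with a coefficient of roughly K/M, which is the shared-input effect the article sets out to eliminate.

    # Illustrative sketch only (not from the article): shared-input correlations
    # arising when N units draw noise from a finite pool of M shared sources.
    import numpy as np

    rng = np.random.default_rng(seed=0)

    M = 300      # size of the finite pool of noise sources (assumed)
    K = 100      # number of sources projecting onto each functional unit (assumed)
    N = 50       # number of functional units drawing noise from the pool (assumed)
    T = 20_000   # number of time steps (assumed)

    # Pool of independent, unit-variance Gaussian noise sources.
    sources = rng.standard_normal((M, T))

    # Each unit receives the sum of K sources picked at random from the pool.
    inputs = np.empty((N, T))
    for i in range(N):
        picks = rng.choice(M, size=K, replace=False)
        inputs[i] = sources[picks].sum(axis=0)

    # Pairwise correlation of the summed noise inputs across units.
    corr = np.corrcoef(inputs)
    mean_corr = corr[np.triu_indices(N, k=1)].mean()

    # Two units share on average K**2 / M sources, so the expected input
    # correlation is roughly K / M (covariance K**2/M over variance K).
    print(f"measured mean input correlation: {mean_corr:.3f}")
    print(f"expected K / M:                  {K / M:.3f}")

Making M large relative to K (so that sources are effectively private), or replacing the source pool by a decorrelating recurrent network as the article proposes, drives this residual correlation toward zero.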
Item Description: Viewed on 17.01.2020
Physical Description: Online resource