Reconstructing computational system dynamics from neural data with recurrent neural networks

Bibliographic Details
Main Authors: Durstewitz, Daniel (Author), Koppe, Georgia (Author), Thurm, Max Ingo (Author)
Format: Article (Journal)
Language: English
Published: November 2023
In: Nature reviews. Neuroscience
Year: 2023, Volume: 24, Issue: 11, Pages: 693-710
ISSN: 1471-0048
DOI: 10.1038/s41583-023-00740-7
Online Access: Publisher, licensed access, full text: https://doi.org/10.1038/s41583-023-00740-7
Publisher, licensed access, full text: https://www.nature.com/articles/s41583-023-00740-7
Author Notes: Daniel Durstewitz, Georgia Koppe & Max Ingo Thurm
Description
Summary: Computational models in neuroscience usually take the form of systems of differential equations. The behaviour of such systems is the subject of dynamical systems theory. Dynamical systems theory provides a powerful mathematical toolbox for analysing neurobiological processes and has been a mainstay of computational neuroscience for decades. Recently, recurrent neural networks (RNNs) have become a popular machine learning tool for studying the non-linear dynamics of neural and behavioural processes by emulating an underlying system of differential equations. RNNs have been routinely trained on behavioural tasks similar to those used for animal subjects to generate hypotheses about the underlying computational mechanisms. By contrast, RNNs can also be trained on the measured physiological and behavioural data, thereby directly inheriting their temporal and geometrical properties. In this way they become a formal surrogate for the experimentally probed system that can be further analysed, perturbed and simulated. This powerful approach is called dynamical system reconstruction. In this Perspective, we focus on recent trends in artificial intelligence and machine learning in this exciting and rapidly expanding field, which may be less well known in neuroscience. We discuss formal prerequisites, different model architectures and training approaches for RNN-based dynamical system reconstructions, ways to evaluate and validate model performance, how to interpret trained models in a neuroscience context, and current challenges.
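To make the reconstruction idea in the summary concrete, the following is a minimal sketch that trains a generic RNN on a measured multivariate time series by one-step-ahead prediction (a simple form of teacher forcing) and then iterates the trained model autonomously as a surrogate of the recorded system. The plain PyTorch RNN, the hyperparameters and the placeholder data are illustrative assumptions, not the specific architectures or training schemes discussed in the article.

```python
# Minimal sketch of RNN-based dynamical system reconstruction (assumptions only,
# not the authors' method): fit an RNN to recorded activity x_t so that it
# predicts x_{t+1}, then run it freely to generate surrogate trajectories.
import torch
import torch.nn as nn

T, N = 1000, 10                      # time steps, number of recorded units
x = torch.randn(T, N)                # placeholder for measured neural data

hidden_dim = 64
rnn = nn.RNN(input_size=N, hidden_size=hidden_dim, batch_first=True)
readout = nn.Linear(hidden_dim, N)   # maps latent state back to observations

opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    # teacher forcing: feed observed x_t, predict the next observation x_{t+1}
    h_seq, _ = rnn(x[:-1].unsqueeze(0))          # (1, T-1, hidden_dim)
    pred = readout(h_seq).squeeze(0)             # (T-1, N)
    loss = loss_fn(pred, x[1:])
    loss.backward()
    opt.step()

# Freely run the trained model on its own predictions: the resulting surrogate
# trajectories can be analysed, perturbed and compared with the data.
with torch.no_grad():
    x_t = x[:1].unsqueeze(0)                     # seed with the first observation
    h = None
    trajectory = []
    for _ in range(500):
        h_seq, h = rnn(x_t, h)
        x_t = readout(h_seq)                     # next predicted observation
        trajectory.append(x_t.squeeze())
    surrogate = torch.stack(trajectory)          # (500, N) generated time series
```

Whether such a surrogate has actually captured the underlying dynamics is then judged by comparing its temporal and geometrical properties with those of the data, which is one of the evaluation questions the Perspective addresses.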
Item Description: Published: 4 October 2023
Accessed on 20 November 2023
Physical Description: Online Resource