DelGrad: exact event-based gradients for training delays and weights on spiking neuromorphic hardware

Bibliographic Details
Main Authors: Göltz, Julian (Author), Weber, Jimmy (Author), Kriener, Laura (Author), Billaudelle, Sebastian (Author), Lake, Peter (Author), Schemmel, Johannes (Author), Payvand, Melika (Author), Petrovici, Mihai A. (Author)
Format: Article (Journal)
Language: English
Published: 09 September 2025
In: Nature Communications
Year: 2025, Volume: 16, Pages: 1-10
ISSN: 2041-1723
DOI: 10.1038/s41467-025-63120-y
Online Access: Publisher, free of charge, full text: https://doi.org/10.1038/s41467-025-63120-y
Publisher, free of charge, full text: https://www.nature.com/articles/s41467-025-63120-y
Description
Summary: Spiking neural networks (SNNs) inherently rely on the timing of signals for representing and processing information. Augmenting SNNs with trainable transmission delays, alongside synaptic weights, has recently been shown to increase their accuracy and parameter efficiency. However, existing training methods to optimize such networks rely on discrete time, approximate gradients, and full access to internal variables such as membrane potentials. This limits their precision, efficiency, and suitability for neuromorphic hardware due to increased memory and I/O-bandwidth demands. Here, we propose DelGrad, an analytical, event-based training method to compute exact loss gradients for both weights and delays. Grounded purely in spike timing, DelGrad eliminates the need to track any other variables to optimize SNNs. We showcase this key advantage by implementing DelGrad on the BrainScaleS-2 mixed-signal neuromorphic platform. For the first time, we experimentally demonstrate the parameter efficiency, accuracy benefits, and stabilizing effect of adding delays to SNNs on noisy hardware. DelGrad thus provides a new way for training SNNs with delays on neuromorphic substrates, with substantial improvements over previous results.
Item Description: Viewed on 24 February 2026
Physical Description: Online Resource