DelGrad: exact event-based gradients for training delays and weights on spiking neuromorphic hardware


Bibliographic Details
Main authors: Göltz, Julian (author), Weber, Jimmy (author), Kriener, Laura (author), Billaudelle, Sebastian (author), Lake, Peter (author), Schemmel, Johannes (author), Payvand, Melika (author), Petrovici, Mihai A. (author)
Document type: Article (Journal)
Language: English
Published: 09 September 2025
In: Nature Communications
Year: 2025, Volume: 16, Pages: 1-10
ISSN:2041-1723
DOI:10.1038/s41467-025-63120-y
Online access: Publisher, open access, full text: https://doi.org/10.1038/s41467-025-63120-y
Publisher, open access, full text: https://www.nature.com/articles/s41467-025-63120-y
Author statement: Julian Göltz, Jimmy Weber, Laura Kriener, Sebastian Billaudelle, Peter Lake, Johannes Schemmel, Melika Payvand & Mihai A. Petrovici
Description
Abstract: Spiking neural networks (SNNs) inherently rely on the timing of signals for representing and processing information. Augmenting SNNs with trainable transmission delays, alongside synaptic weights, has recently been shown to increase their accuracy and parameter efficiency. However, existing training methods to optimize such networks rely on discrete time, approximate gradients, and full access to internal variables such as membrane potentials. This limits their precision, efficiency, and suitability for neuromorphic hardware due to increased memory and I/O-bandwidth demands. Here, we propose DelGrad, an analytical, event-based training method to compute exact loss gradients for both weights and delays. Grounded purely in spike timing, DelGrad eliminates the need to track any other variables to optimize SNNs. We showcase this key advantage by implementing DelGrad on the BrainScaleS-2 mixed-signal neuromorphic platform. For the first time, we experimentally demonstrate the parameter efficiency, accuracy benefits, and stabilizing effect of adding delays to SNNs on noisy hardware. DelGrad thus provides a new way for training SNNs with delays on neuromorphic substrates, with substantial improvements over previous results.
Description: Viewed on 24.02.2026
Description: Online resource