Understanding event-generation networks via uncertainties
Following the growing success of generative neural networks in LHC simulations, the crucial question is how to control the networks and assign uncertainties to their event output. We show how Bayesian normalizing flows or invertible networks capture uncertainties from the training and turn them into...
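The record itself contains no code; purely as a rough illustration of the general idea behind the (truncated) abstract, the sketch below shows how sampling network weights from an approximate posterior and reading off the spread of the generated distributions yields a per-bin uncertainty. The "flow" is a toy one-parameter affine map and every name and value is hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "Bayesian" affine flow: x = mu + sigma * z, with a Gaussian posterior
# over (mu, log_sigma) standing in for the variational weight posterior of a
# Bayesian invertible network. All posterior values below are made up.
post_mean = np.array([0.5, np.log(1.2)])   # posterior means of (mu, log_sigma)
post_std  = np.array([0.05, 0.03])         # posterior widths of (mu, log_sigma)

n_weight_samples = 50      # draws from the weight posterior
n_events = 100_000         # generated events per weight sample
bins = np.linspace(-4.0, 6.0, 41)

histograms = []
for _ in range(n_weight_samples):
    mu, log_sigma = rng.normal(post_mean, post_std)  # sample one set of "weights"
    z = rng.standard_normal(n_events)                # latent Gaussian noise
    x = mu + np.exp(log_sigma) * z                   # push latents through the flow
    h, _ = np.histogram(x, bins=bins, density=True)
    histograms.append(h)

histograms = np.array(histograms)
central = histograms.mean(axis=0)      # central prediction per bin
uncertainty = histograms.std(axis=0)   # spread induced by the weight posterior

print(central[:5], uncertainty[:5])
```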
| Main authors: | Marco Bellagente, Manuel Haußmann, Michel Luchmann, Tilman Plehn |
|---|---|
| Document type: | Article (Journal), Chapter/Article |
| Language: | English |
| Published: | October 4, 2021 |
| In: | arXiv, 2021, pages 1-26 |
| DOI: | 10.48550/arXiv.2104.04543 |
| Online access: | Full text (publisher, licensed): https://doi.org/10.48550/arXiv.2104.04543 and http://arxiv.org/abs/2104.04543 |
| Author statement: | Marco Bellagente, Manuel Haußmann, Michel Luchmann, and Tilman Plehn |