A Lorentz-equivariant transformer for all of the LHC

Bibliographic details
Main authors: Brehmer, Johann (author), Bresó Pla, Víctor (author), de Haan, Pim (author), Plehn, Tilman (author), Qu, Huilin (author), Spinner, Jonas (author), Thaler, Jesse (author)
Document type: Article (Journal)
Language: English
Published: 23 October 2025
In: SciPost Physics
Year: 2025, Volume: 19, Issue: 4, Pages: 1-30
ISSN: 2542-4653
DOI: 10.21468/SciPostPhys.19.4.108
Online access: Publisher, subject to licensing, full text: https://doi.org/10.21468/SciPostPhys.19.4.108
Publisher, subject to licensing, full text: https://scipost.org/10.21468/SciPostPhys.19.4.108
Author statement: Johann Brehmer, Victor Bresó, Pim de Haan, Tilman Plehn, Huilin Qu, Jonas Spinner and Jesse Thaler
Description
Abstract: We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is equivariant under Lorentz transformations. The underlying architecture is a versatile and scalable transformer, which is able to break symmetries if needed. We demonstrate the power of L-GATr for amplitude regression and jet classification, and then benchmark it as the first Lorentz-equivariant generative network. For all three LHC tasks, we find significant improvements over previous architectures.
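The Lorentz equivariance claimed in the abstract means that transforming the input four-momenta and then applying the network gives the same result as applying the network first and transforming its outputs. The short Python/NumPy sketch below illustrates that constraint with a hypothetical toy layer (names such as toy_equivariant_layer and boost_z are invented for this example and are not part of L-GATr): it builds its outputs only from Minkowski inner products and linear combinations of the inputs, which is sufficient for exact equivariance.

    import numpy as np

    # Minkowski metric with signature (+, -, -, -)
    ETA = np.diag([1.0, -1.0, -1.0, -1.0])

    def boost_z(rapidity):
        """Lorentz boost along the z-axis as a 4x4 matrix."""
        ch, sh = np.cosh(rapidity), np.sinh(rapidity)
        L = np.eye(4)
        L[0, 0] = L[3, 3] = ch
        L[0, 3] = L[3, 0] = sh
        return L

    def toy_equivariant_layer(p, weights):
        """Hypothetical toy layer, not the L-GATr architecture: each output
        four-vector is a linear combination of the input four-vectors, with
        mixing coefficients that depend only on the Lorentz-invariant
        Minkowski inner products p_i . p_j."""
        gram = p @ ETA @ p.T              # invariant inner products, shape (n, n)
        coeffs = np.tanh(gram @ weights)  # invariant-valued mixing coefficients
        return coeffs @ p                 # outputs transform like the inputs

    rng = np.random.default_rng(0)
    p = rng.normal(size=(3, 4))           # three toy particle four-momenta
    w = rng.normal(size=(3, 3))
    L = boost_z(0.7)

    # Equivariance check: boosting the inputs and applying the layer agrees
    # with applying the layer and boosting its outputs, f(Lambda p) = Lambda f(p).
    print(np.allclose(toy_equivariant_layer(p @ L.T, w),
                      toy_equivariant_layer(p, w) @ L.T))   # prints True

L-GATr itself realizes the same constraint differently, by representing the data in a geometric algebra over space-time and processing it with an equivariant transformer, as described in the abstract; the sketch only illustrates what the equivariance property guarantees.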
Description: Published: 23 October 2025
Accessed on 04.12.2025
Description: Online resource