Relative entropy and mutual information in Gaussian statistical field theory

Bibliographic Details
Main Authors: Schröfl, Markus (Author), Flörchinger, Stefan (Author)
Format: Article (Journal)
Language: English
Published: 17 December 2024
In: Annales Henri Poincaré
Year: 2024, Pages: 1-87
ISSN: 1424-0661
DOI: 10.1007/s00023-024-01522-2
Online Access: Publisher, free of charge, full text: https://doi.org/10.1007/s00023-024-01522-2
Author Notes: Markus Schröfl and Stefan Floerchinger
Description
Summary: Relative entropy is a powerful measure of the dissimilarity between two statistical field theories in the continuum. In this work, we study the relative entropy between Gaussian scalar field theories in a finite volume with different masses and boundary conditions. We show that the relative entropy depends crucially on d, the dimension of Euclidean space. Furthermore, we demonstrate that the mutual information between two disjoint regions in $$\mathbb{R}^d$$ is finite if the two regions are separated by a finite distance and satisfies an area law. We then construct an example of “touching” regions between which the mutual information is infinite. We argue that the properties of mutual information in scalar field theories can be explained by the Markov property of these theories.
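For context, the two quantities named in the summary have standard definitions, sketched here in their general form; the article's own conventions and continuum regularization may differ. The relative entropy between two probability measures $$P$$ and $$Q$$ is $$S(P\,\|\,Q)=\int \mathrm{d}P\,\ln\frac{\mathrm{d}P}{\mathrm{d}Q},$$ which for two centered Gaussian measures with covariance matrices $$\Sigma_1$$ and $$\Sigma_2$$ in $$n$$ dimensions evaluates to $$S\big(\mathcal{N}(0,\Sigma_1)\,\|\,\mathcal{N}(0,\Sigma_2)\big)=\tfrac{1}{2}\Big[\operatorname{tr}\big(\Sigma_2^{-1}\Sigma_1\big)-n-\ln\det\big(\Sigma_2^{-1}\Sigma_1\big)\Big].$$ The mutual information between two regions $$A$$ and $$B$$ is the relative entropy between the joint state and the product of its marginals, $$I(A:B)=S\big(P_{AB}\,\|\,P_A\otimes P_B\big).$$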
Item Description: Viewed on 07.03.2025
Physical Description: Online Resource