Computing the distance between unbalanced distributions: the flat metric

Bibliographic Details
Main Authors: Schmidt, Henri (Author), Düll, Christian (Author)
Format: Article (Journal)
Language: English
Published: 24 July 2025
In: Machine Learning
Year: 2025, Volume: 114, Issue: 8, Pages: 1-34
ISSN: 1573-0565
DOI: 10.1007/s10994-025-06828-8
Online Access: Publisher, free of charge, full text: https://doi.org/10.1007/s10994-025-06828-8
Author Notes: Henri Schmidt, Christian Düll
Description
Summary: We provide an implementation to compute the flat metric in any dimension. The flat metric, also called dual bounded Lipschitz distance, generalizes the well-known Wasserstein distance $$W_1$$ to the case that the distributions are of unequal total mass. Thus, our implementation adapts very well to mass differences and uses them to distinguish between different distributions. This is of particular interest for unbalanced optimal transport tasks and for the analysis of data distributions where the sample size is important or normalization is not possible. The core of the method is based on a neural network to determine an optimal test function realizing the distance between two given measures. Special focus was put on achieving comparability of pairwise computed distances from independently trained networks. We tested the quality of the output in several experiments where ground truth was available as well as with simulated data.
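For orientation, the flat metric between finite (possibly unequal-mass) measures $$\mu$$ and $$\nu$$ admits the dual formulation $$\rho_F(\mu,\nu)=\sup\{\int f\,d(\mu-\nu)\;:\;\Vert f\Vert_\infty\le 1,\ \mathrm{Lip}(f)\le 1\}$$. The following is a minimal illustrative sketch, not the authors' implementation: it estimates this supremum between two weighted point clouds using a small PyTorch network as the test function, keeps $$|f|\le 1$$ via a tanh output, and enforces the Lipschitz bound only approximately through a gradient penalty. The function name flat_metric_estimate and all hyperparameters are hypothetical.

import torch
import torch.nn as nn

class TestFunction(nn.Module):
    # Small MLP playing the role of the test function f; tanh keeps |f| <= 1.
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return torch.tanh(self.net(x))

def flat_metric_estimate(x_mu, w_mu, x_nu, w_nu, steps=2000, lam=10.0):
    # mu = sum_i w_mu[i] * delta_{x_mu[i]}, nu analogously; the total masses
    # w_mu.sum() and w_nu.sum() may differ (unbalanced case).
    f = TestFunction(x_mu.shape[1])
    opt = torch.optim.Adam(f.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        # Dual objective: integral of f against (mu - nu).
        obj = (w_mu * f(x_mu).squeeze(-1)).sum() - (w_nu * f(x_nu).squeeze(-1)).sum()
        # Gradient penalty pushing ||grad f|| <= 1 (approximate Lipschitz constraint).
        pts = torch.cat([x_mu, x_nu]).detach().clone().requires_grad_(True)
        grads = torch.autograd.grad(f(pts).sum(), pts, create_graph=True)[0]
        penalty = ((grads.norm(dim=1) - 1.0).clamp(min=0.0) ** 2).mean()
        (-obj + lam * penalty).backward()  # maximize obj, penalize constraint violation
        opt.step()
    with torch.no_grad():
        return ((w_mu * f(x_mu).squeeze(-1)).sum()
                - (w_nu * f(x_nu).squeeze(-1)).sum()).item()

# Example: two 2D Gaussian samples with unequal total mass (illustrative values only).
x_mu, w_mu = torch.randn(200, 2), torch.full((200,), 0.01)        # total mass 2.0
x_nu, w_nu = torch.randn(150, 2) + 1.0, torch.full((150,), 0.01)  # total mass 1.5
print(flat_metric_estimate(x_mu, w_mu, x_nu, w_nu))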
Item Description: Published: 24 July 2025
Viewed on 17.11.2025
Physical Description: Online Resource