Physically consistent and efficient variational denoising of image fluid flow estimates

Bibliographic Details
Main Authors: Vlasenko, Andrey (author); Schnörr, Christoph (author)
Document Type: Article (Journal)
Language: English
Published: 2010
In: IEEE Transactions on Image Processing
Year: 2010, Volume: 19, Issue: 3, Pages: 586-595
ISSN: 1941-0042
DOI: 10.1109/TIP.2009.2036673
Online Access: Resolving system, license required, full text: https://doi.org/10.1109/TIP.2009.2036673
Publisher, license required, full text: https://ieeexplore.ieee.org/document/5339167/
Statement of Responsibility: Andrey Vlasenko and Christoph Schnörr
Description
Abstract: Imaging plays an important role in experimental fluid dynamics, both for scientific research and for a range of industrial applications. Estimated velocity fields of fluids, however, often suffer from various types of corruption, such as missing data, that make their physical interpretation questionable. We present an algorithm that accepts a wide variety of corrupted 2-D vector fields as input and recovers missing data fragments and removes noise in a physically plausible way. Our approach essentially exploits the physical properties of incompressible fluid flows and does not rely on any particular noise model. As a result, the algorithm performs well and robustly for different types of noise and estimation errors. The computational algorithm is simple enough to scale up to large 3-D problems.
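
Note on the abstract: the incompressibility property the authors exploit means a physically valid velocity field is divergence-free (div u = 0). The Python sketch below is not the paper's variational algorithm; it is a minimal illustration of that constraint, assuming periodic boundary conditions, a uniform grid, and an FFT-based Helmholtz/Leray projection. The function name and test setup are hypothetical.

import numpy as np

def project_divergence_free(u, v):
    # Hypothetical helper (not from the paper): project a 2-D velocity
    # field (u, v) onto its divergence-free component via FFT-based
    # Helmholtz/Leray projection, assuming periodic boundaries.
    ny, nx = u.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx)   # angular wavenumbers in x
    ky = 2 * np.pi * np.fft.fftfreq(ny)   # angular wavenumbers in y
    KX, KY = np.meshgrid(kx, ky)

    u_hat, v_hat = np.fft.fft2(u), np.fft.fft2(v)
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0  # avoid 0/0 at the zero-wavenumber (mean) mode

    # Subtract the curl-free part: apply (I - k k^T / |k|^2) per mode.
    div_hat = KX * u_hat + KY * v_hat
    u_hat -= KX * div_hat / k2
    v_hat -= KY * div_hat / k2
    return np.fft.ifft2(u_hat).real, np.fft.ifft2(v_hat).real

# Usage: corrupt a solenoidal test field with noise, then project.
ny = nx = 64
x = 2 * np.pi * np.arange(nx) / nx
y = 2 * np.pi * np.arange(ny) / ny
X, Y = np.meshgrid(x, y)
u_true = np.cos(X) * np.sin(Y)     # divergence-free by construction
v_true = -np.sin(X) * np.cos(Y)
rng = np.random.default_rng(0)
u_noisy = u_true + 0.2 * rng.standard_normal((ny, nx))
v_noisy = v_true + 0.2 * rng.standard_normal((ny, nx))
u_clean, v_clean = project_divergence_free(u_noisy, v_noisy)

The projection removes the divergence the noise introduced; the paper goes further by combining such physical constraints with variational denoising, without assuming a particular noise model.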
Description: Published online on 20 November 2009; current version published on 18 February 2010
Viewed on 07.09.2023
Description: Online resource