Guided image generation with conditional invertible neural networks

Bibliographic details
Main authors: Ardizzone, Lynton (author), Lüth, Carsten (author), Kruse, Jakob (author), Rother, Carsten (author), Köthe, Ullrich (author)
Document type: Article (journal), chapter/article
Language: English
Published: 10 Jul 2019
In: arXiv
Year: 2019, Pages: 1-11
DOI: 10.48550/arXiv.1907.02392
Online access: Publisher, license required, full text: https://doi.org/10.48550/arXiv.1907.02392
Publisher, license required, full text: http://arxiv.org/abs/1907.02392
Statement of responsibility: Lynton Ardizzone, Carsten Lüth, Jakob Kruse, Carsten Rother, Ullrich Köthe
Description
Abstract: In this work, we address the task of natural image generation guided by a conditioning input. We introduce a new architecture called conditional invertible neural network (cINN). The cINN combines the purely generative INN model with an unconstrained feed-forward network, which efficiently preprocesses the conditioning input into useful features. All parameters of the cINN are jointly optimized with a stable, maximum likelihood-based training procedure. By construction, the cINN does not experience mode collapse and generates diverse samples, in contrast to, e.g., cGANs. At the same time, our model produces sharp images, since no reconstruction loss is required, in contrast to, e.g., VAEs. We demonstrate these properties for the tasks of MNIST digit generation and image colorization. Furthermore, we take advantage of our bi-directional cINN architecture to explore and manipulate emergent properties of the latent space, such as changing the image style in an intuitive way.
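The architecture and training objective described in the abstract can be illustrated with a short sketch: a conditional affine coupling block whose scale/shift subnetwork sees the conditioning features, trained by maximizing the exact likelihood given by the change-of-variables formula. The following is a minimal, hypothetical PyTorch example; all names and sizes (ConditionalCoupling, cond_dim, nll_loss) are illustrative assumptions, not the authors' code, and the feed-forward conditioning network is abstracted into the feature tensor c.

import torch
import torch.nn as nn

class ConditionalCoupling(nn.Module):
    """One conditional affine coupling block: the scale/shift subnetwork
    sees both half of the input and the preprocessed condition features."""
    def __init__(self, dim, cond_dim, hidden=128):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x, c):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(torch.cat([x1, c], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)                       # bounded scales for stable training
        z2 = x2 * torch.exp(s) + t              # invertible given x1 and c
        return torch.cat([x1, z2], dim=1), s.sum(dim=1)  # output and log|det J|

    def inverse(self, z, c):
        z1, z2 = z[:, :self.d], z[:, self.d:]
        s, t = self.net(torch.cat([z1, c], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)
        return torch.cat([z1, (z2 - t) * torch.exp(-s)], dim=1)

def nll_loss(z, log_det):
    # Maximum-likelihood objective under a standard-normal latent prior:
    # -log p(x | c) = 0.5 * ||z||^2 - log|det J|  (up to an additive constant).
    return (0.5 * z.pow(2).sum(dim=1) - log_det).mean()

# Hypothetical usage: c stands in for the conditioning network's features.
block = ConditionalCoupling(dim=8, cond_dim=4)
x, c = torch.randn(16, 8), torch.randn(16, 4)
z, log_det = block(x, c)
loss = nll_loss(z, log_det)                       # minimized jointly over all parameters
x_sampled = block.inverse(torch.randn(16, 8), c)  # generation: invert latent samples

Because every block is invertible, the same weights support both directions: the forward pass yields exact likelihoods for training, and the inverse pass generates samples from latent draws, which is what enables the bi-directional latent-space manipulations mentioned in the abstract.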
Description: Viewed on 19 Jul 2022
Description: Online resource