GRUU-Net: Integrated convolutional and gated recurrent neural network for cell segmentation

Bibliographic Details
Main authors: Wollmann, Thomas (author); Gunkel, Manuel (author); Erfle, Holger (author); Rippe, Karsten (author); Rohr, Karl (author)
Document type: Article (Journal)
Language: English
Published: 31 May 2019
In: Medical Image Analysis
Year: 2019, Volume: 56, Pages: 68-79
ISSN:1361-8423
DOI:10.1016/j.media.2019.04.011
Online access: Publisher, full text: https://doi.org/10.1016/j.media.2019.04.011
Publisher: http://www.sciencedirect.com/science/article/pii/S1361841518306753
Author statement: T. Wollmann, M. Gunkel, I. Chung, H. Erfle, K. Rippe, K. Rohr
Description
Summary: Cell segmentation in microscopy images is a common and challenging task. In recent years, deep neural networks have achieved remarkable improvements in the field of computer vision. The dominant paradigm in segmentation is the use of convolutional neural networks; recurrent neural networks are less common. In this work, we propose a new deep learning method for cell segmentation that integrates convolutional neural networks and gated recurrent neural networks over multiple image scales to exploit the strengths of both types of networks. To increase the robustness of training and improve segmentation, we introduce a novel focal loss function. We also present a distributed scheme for optimized training of the integrated neural network. We applied the proposed method to challenging data of glioblastoma cell nuclei and performed a quantitative comparison with state-of-the-art methods. We also provide insights into how our extensions affect training and inference. Moreover, we benchmarked our method on all 22 real microscopy datasets of the Cell Tracking Challenge.
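The abstract mentions a novel focal loss function for robust training. The paper's own variant is not reproduced here; as background, the standard binary focal loss it builds on (Lin et al.'s formulation, with the usual alpha/gamma weighting) can be sketched as follows. The function name and default parameters are illustrative, not the authors' implementation:

```python
import math

def focal_loss(probs, labels, gamma=2.0, alpha=0.25, eps=1e-7):
    """Standard binary focal loss (Lin et al., 2017), not the paper's variant.

    Down-weights well-classified pixels via the (1 - p_t)^gamma factor so
    that hard examples (e.g. ambiguous cell boundaries) dominate training.

    probs:  predicted foreground probabilities in [0, 1]
    labels: ground-truth labels in {0, 1}
    """
    total = 0.0
    for p, y in zip(probs, labels):
        p = min(max(p, eps), 1.0 - eps)          # numerical stability
        p_t = p if y == 1 else 1.0 - p           # prob. of the true class
        a_t = alpha if y == 1 else 1.0 - alpha   # class-balance weight
        # (1 - p_t)^gamma -> ~0 for confident correct predictions
        total += -a_t * (1.0 - p_t) ** gamma * math.log(p_t)
    return total / len(probs)

# A confidently correct prediction contributes far less than an error:
easy = focal_loss([0.95], [1])
hard = focal_loss([0.10], [1])
assert easy < hard
```

With gamma set to 0 and alpha to 1, the expression reduces to ordinary binary cross-entropy, which makes the focusing effect of the gamma exponent easy to isolate in experiments.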
Note: Viewed on 21 October 2019
Description: Online resource